Researchers developed an AI-based voice application that analyzes a patient's speech patterns to monitor their mental health.
Credit: UCLA Health
A new study finds that an interactive voice application using artificial intelligence is an effective way to monitor the wellbeing of patients being treated for serious mental illness.
Researchers from UCLA followed 47 patients for up to 14 months using an application called MyCoachConnect. All of the patients were being treated by physicians for serious mental illnesses, including bipolar disorder, schizophrenia and major depressive disorder.
For the study, published in PLOS ONE, participants called a toll-free number one or two times a week and answered three open-ended questions when prompted by a computer-generated voice. The questions were: "How have you been over the past few days?" "What's been troubling or challenging over the past few days?" and "What's been particularly good or positive?"
MyCoachConnect was designed to collect personalized patient responses, said lead author Dr. Armen Arevian, director of the Innovation Lab at the Jane and Terry Semel Institute for Neuroscience and Human Behavior. Specifically, the AI was trained to use an individual's own words to offer a personalized analysis for each patient. The application focused primarily on the patients' choice of words in their responses and on how those responses changed over time, with a smaller emphasis on audio features such as tone of voice.
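The article does not detail the actual feature set or models behind MyCoachConnect, but the general idea it describes, tracking a patient's own word choices and how they shift from call to call, can be illustrated with a minimal sketch. The example below is an assumption-laden illustration, not the study's pipeline: it uses TF-IDF word features and cosine similarity as stand-ins, and the sample responses are invented.

```python
# Illustrative sketch only -- not the MyCoachConnect pipeline.
# It shows the general notion described above: represent each transcribed
# response by the words it uses, then track how that representation drifts
# across successive calls from the same patient.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical transcribed answers from one patient, in chronological order.
responses = [
    "I slept well and spent time with my sister this week",
    "I have been tired and a bit worried about work",
    "I could not sleep and everything feels overwhelming lately",
]

# Word-choice features: the vectorizer is fit per patient, so the vocabulary
# is the patient's own words, echoing the personalized framing in the article.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(responses)

# Change over time: compare each call to the previous one. A large lexical
# change could be one signal worth surfacing to a clinician.
for i in range(1, len(responses)):
    drift = 1.0 - cosine_similarity(X[i - 1], X[i])[0, 0]
    print(f"call {i} -> call {i + 1}: lexical change = {drift:.2f}")
```

In a real system these text features would be combined with audio features (such as tone of voice, as the article notes) and evaluated against clinician assessments; none of that is shown here.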
The analysis of the data, conducted in collaboration with researchers from USC's Signal Analysis and Interpretation Laboratory (SAIL), found that the application was as accurate at monitoring patients' mental states as their treating physicians were.
“The way people answer questions and the way they change their answers over time is unique to each patient,” Arevian said. “We were looking at a person as a person and not as a diagnosis.”
For the study, patients made calls from a mobile phone, landline or pay phone, and were asked to speak for two to three minutes in response to each question.
“Technology doesn’t have to be complicated,” Arevian said. “In this study, patients didn’t need a smartphone or an app at all. It could be simple and low-tech on the patient end, and high-tech on the back end.”
Researchers hope that artificial intelligence that can analyze data collected from apps such as MyCoachConnect will enable more proactive and personalized care for individuals. The application, for example, may help improve treatment by intervening early when someone is experiencing more symptoms.
“Artificial intelligence allowed us to illuminate the various clinically meaningful dimensions of language use and vocal patterns of the patients over time and personalized at each individual level,” said senior author Dr. Shri Narayanan, Niki and Max Nikias Chair in Engineering and director of SAIL at the USC Viterbi School of Engineering.
Some participants were interviewed after the study ended, and said they found the system easy and enjoyable to use, Arevian said.
“They said speaking to a computer-generated voice allowed them to speak more freely,” Arevian said. “They also said it helped them feel less lonely because they knew that someone would be listening to it, and to them that meant that someone cared.”
MyCoachConnect was developed and hosted on the Chorus platform, which Arevian developed at UCLA and which allows people to visually create mobile and other computer applications, without computer programming, in as little as a few minutes.
###
Media Contact
Marrecca Fiore
[email protected]
310-267-7095