Two professors working at the intersection of artificial intelligence and mental health shared their work Friday evening at the Ann Arbor District Library, in partnership with the University of Michigan's AI Laboratory.

Emily Mower Provost, associate professor of computer science and electrical engineering, and Melvin McInnis, professor of bipolar disorder and depression, are working together to develop computational methods for measuring mood symptom severity in bipolar disorder. McInnis is the director of the Heinz C. Prechter Bipolar Research Program, and Provost is a member. 

The panelists first discussed why they were involved in the project. Provost, who has long been interested in human behavior and in improving people's lives through engineering, said she was excited to work at the intersection of human-robot interaction and affective computing. 

“It gives me an opportunity not only to try to create new and really innovative algorithms, but when you put a human-centered spin on AI, then you also have the opportunity to really join engineering and science,” Provost said. “To me, it seemed like a really interesting opportunity to do something meaningful, to do engineering that had an impact on people’s lives. … We specialized and started working in emotion recognition, where the goal was to take in speech and try to quantify the ambiguity that’s associated with how people express emotions, which was exciting.” 

McInnis, who has been interested in bipolar disorder for more than 30 years, said he hoped to teach a computer to detect patients’ changes in emotion based on their speech. As a physician, he said he meets family members who can detect high-risk periods for bipolar patients through their speech. 

“The family members say, ‘There was something in their voice, 10 days ago, there was something different and something changed,’” he said. “Our work is to identify biological markers, physiological markers, that are in speech. How can we teach the computer to do what the family member is doing? How can we develop objective measures that will be useful in identifying high-risk periods for individuals with mood disorders, such as bipolar disorder?”

The University and the Heinz C. Prechter Bipolar Research Program recently received a $5.8 million gift toward bipolar disorder research. McInnis and Provost addressed some of the key challenges in creating an application or device to track speech patterns for bipolar disorder, the first being patient compliance. McInnis said the device will need to be passive, so that patients and family members do not have to worry about tracking anything themselves. 

“It needs to be something that could be the silent monitor,” McInnis said. “For the individual with the long-term chronic disease, it’s not just monitoring an individual for six months or one year, but it’s five years. So, what happens when somebody is going along and, all of a sudden, you realize things are starting to go down a bit, your device can give an alert and say, ‘Maybe you should talk to your doctor soon.’ You can share this information with your care team, with your support network, so that you can be part of a team that’s helping you stay healthy longer.”

Provost said another challenge for AI-based emotion recognition is cultural differences in the way people express themselves. 

“The use of the smile, of course, is different across different cultures,” Provost said. “People are differently willing to express their emotions … So really thinking about how your models are able to understand, what is the baseline for a given individual? Because it’s not just culture that influences how people’s expression patterns differ, but also aspects of personality and temperament.” 

The panelists also discussed the ethics of data privacy and of using patient data to study human behavior in mood disorders. McInnis argued this research must be done to improve patients’ lives. 

“From an ethical standpoint, it would be unethical not to do this research,” he said. “The driving factor of it is I work with patients with bipolar disorder … up to 20 percent of these individuals end their lives by suicide. So, someone who goes into a manic state, literally dancing on the table telling everyone what they’re doing, they can be driven to do behaviors that have personal, social and vocational consequences. And they will say, ‘Listen, privacy, if I get into a manic state, my privacy is gone. Everybody knows what’s happening to me, they can see what I do. I have no privacy. If there’s a chance this would help me, I’m in.’”

He added that patients trust the University to conduct ethical and safe research. 

“We, the scientific community, have to earn the trust of the people that we serve,” McInnis said. “When an individual comes to my office today in a clinical context and we say, ‘Are you interested in participating in this research?’ and he thinks about it for five seconds and says, ‘Since I trust you, I’m going to do this.’ He’s doing it because he trusts the University of Michigan to do the right thing and to be ethical and to really be sure that when we say something, that we have the knowledge and the assurance that we’ve done the very best.”

Provost said she hoped AI would learn to improve itself over time, so it could more accurately predict high-risk periods for patients. She also highlighted the importance of AI and humans working together, emphasizing AI would not be a replacement for health care, physicians and nurses. 

“The goal of all of these systems is to understand what health looks like and understand how to detect deviations,” Provost said. “Then you have tools to help in a domain like depression, or you have tools to help you in a domain like post-traumatic stress disorder or anxiety or stress. And the question always is, how can you take systems that work well in one space and change them, adapt them with new data, with a new understanding for what patterns look like, and make them effective in another space as well? We hope for AI systems to lead to new treatments and the discovery of new signals of depression or bipolar disorder.” 

Engineering senior Kartik Pandit came to the event because he is taking a course focused on software for accessibility. Pandit wanted to learn more about how AI can help with cognitive behavioral therapy and with measuring emotions, which can be difficult because of their more ambiguous biological markers.

“In other fields, you have hard evidence; you have X-rays you can use to detect cancer and so on,” Pandit said. “What are those features that you would use to identify bipolar disorder or mental health issues? I kind of got my answer because what they’re using is speech, but also emotion is as hard to quantify as I thought it would be, so they’re looking for things that are more quantifiable like heart rate and sleep patterns.” 

LSA senior Danielle Newport, who came with Pandit, said AI can work with humans to improve the mental health care system, but cannot replace them. 

“I was curious what it even means for AI to work in joint with mental health,” Newport said. “The idea of AI in medicine in general and specifically in mental health, it’s supposed to be a guiding factor and not replace any clinician. AI is a tool and not something that should necessarily be a stand-in for a human. There will always be a need for humans with the technology.”

 
