AI-powered phone app detects depression from facial expression

MoodCapture app opens the door to real-time digital mental health support.

People with major depression may find it hard to assess how severe their symptoms are. In the near future, a smartphone app that uses artificial intelligence to analyze facial expressions could help them monitor their mental health.

A team of scientists at New Hampshire’s Dartmouth College has developed a smartphone application called MoodCapture that can detect depression before the user even realizes they might be experiencing it.

The app uses artificial intelligence and facial-image processing software to analyze facial expressions and surroundings during regular phone use, looking for clinical cues associated with depression.

The app works as follows: each time the user unlocks their phone with its facial recognition system, the device’s front camera captures a burst of photos of their face and surroundings. An AI-driven algorithm then assesses these images, analyzing the user’s facial expression along with the background of each shot.
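
In code, the unlock-triggered capture-and-score loop might look something like the sketch below. Every name in it (DepressionScorer, on_unlock, suggest_intervention) is a hypothetical stand-in; the researchers have not published MoodCapture’s implementation, so this only illustrates the passive-sensing idea:

```python
from dataclasses import dataclass
from typing import List

import numpy as np


@dataclass
class DepressionScorer:
    """Hypothetical stand-in for MoodCapture's on-device model."""
    threshold: float = 0.5  # risk score above which the app intervenes

    def predict(self, images: List[np.ndarray]) -> float:
        # A real model would analyze facial expression and scene features;
        # as a placeholder, map mean brightness to a pseudo-score, echoing
        # the study's finding that lighting is one informative cue.
        brightness = float(np.mean([img.mean() for img in images])) / 255.0
        return 1.0 - brightness  # darker photos -> higher placeholder risk


def suggest_intervention() -> None:
    # The app nudges rather than alarms: positive, low-stakes suggestions.
    print("You might feel better after a walk outside or a call with a friend.")


def on_unlock(frames: List[np.ndarray], scorer: DepressionScorer) -> None:
    """Run once per face-recognition unlock, on the burst of front-camera photos."""
    if scorer.predict(frames) >= scorer.threshold:
        suggest_intervention()


# Example: three fake 64x64 RGB "photos" standing in for a capture burst.
burst = [np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(3)]
on_unlock(burst, DepressionScorer())
```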

If the app detects that the user’s depression is worsening, it offers suggestions such as getting outdoor exercise or spending time with loved ones. Rather than issuing a stern warning to seek psychiatric attention, a message that could itself deepen a low mood, the app encourages positive steps that may help users feel better.

In a study with 177 people diagnosed with major depressive disorder, the app correctly identified early symptoms of depression with 75% accuracy.

Over a period of 90 days, participants’ phones photographed them as they rated their agreement with the statement “I have felt down, depressed, or hopeless.” The prompt is drawn from the eight-item Patient Health Questionnaire (PHQ-8), commonly used to evaluate depression.
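
For context on the labels the model was trained against, here is a minimal PHQ-8 scoring sketch. The scale’s structure is standard (eight items rated 0 to 3 over the past two weeks, totals from 0 to 24, with 10 or above the conventional cutoff for likely major depression), but the helper itself is illustrative and not part of the study’s code:

```python
from typing import Sequence


def phq8_score(responses: Sequence[int]) -> int:
    """Sum eight item ratings (0 = not at all ... 3 = nearly every day)."""
    if len(responses) != 8 or any(not 0 <= r <= 3 for r in responses):
        raise ValueError("PHQ-8 expects eight ratings, each from 0 to 3")
    return sum(responses)  # totals range 0-24; >= 10 suggests major depression


# Example: mostly low ratings, but "felt down, depressed, or hopeless"
# (the item MoodCapture prompts with) rated "nearly every day".
print(phq8_score([1, 3, 1, 2, 0, 1, 1, 0]))  # -> 9, just under the cutoff of 10
```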

The participants were unaware that their phones were photographing them as they responded to the questionnaire, reducing the chance that they would subconsciously mask their emotions.

When the resulting 125,000 photos were subsequently analyzed, the AI identified the facial expressions, in some subsets of the images, that coincided with the strongest agreement with the prompt. These expressions included variations in gaze direction, eye movement, head position, and muscle rigidity. It also identified recurring environmental factors, such as dominant colors, lighting, photo location, and the number of people in the image.
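
The study does not describe its feature pipeline at the level of code, but as a rough sketch of the environmental side of the analysis, the snippet below pulls two of the cues mentioned, overall lighting and dominant color, from a photo using Pillow and NumPy. The function name and the coarse color-quantization scheme are illustrative assumptions:

```python
import numpy as np
from PIL import Image


def scene_features(path: str) -> dict:
    """Extract crude lighting and dominant-color cues from one photo."""
    img = Image.open(path).convert("RGB").resize((64, 64))
    pixels = np.asarray(img, dtype=np.float32) / 255.0  # shape (64, 64, 3)
    brightness = float(pixels.mean())                   # proxy for lighting
    # Dominant color: quantize each channel to 4 levels, take the modal bin.
    coarse = (pixels * 3).round().astype(int)           # values in 0..3
    codes = coarse.reshape(-1, 3)
    uniq, counts = np.unique(codes, axis=0, return_counts=True)
    dominant = uniq[counts.argmax()] / 3.0              # back to [0, 1] RGB
    return {"brightness": brightness, "dominant_rgb": dominant.tolist()}
```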

The researchers expect the app’s current 75% accuracy to reach at least 90% within five years. The advantage of MoodCapture is that it allows patients to assess their illness far more frequently, responding to downswings before they progress too far.

“This demonstrates a path toward a powerful tool for evaluating a person’s mood in a passive way and using the data as a basis for therapeutic intervention,” says study co-author Andrew Campbell, noting that 90% accuracy would be the threshold for a viable sensor. “My feeling is that technology such as this could be available to the public within five years. We’ve shown that this is doable.”

Journal reference:

  1. Subigya Nepal, Arvind Pillai, Weichen Wang, Tess Griffin, Amanda C. Collins, Michael Heinz, Damien Lekkas, Shayan Mirjafari, Matthew Nemesure, George Price, Nicholas C. Jacobson, Andrew T. Campbell. MoodCapture: Depression Detection Using In-the-Wild Smartphone Images. arXiv preprint arXiv:2402.16182, 2024.