EmoNet: A computer system that knows how you feel

Could a computer, at a glance, tell the difference between a joyful image and a depressing one?

Theorists have proposed that emotions are canonical responses to situations ancestrally linked to survival. If this is true, then emotions may be conveyed by features of the sensory environment. However, few computational models describe how combinations of stimulus features evoke different emotions.

In new research, scientists at CU Boulder have developed a convolutional neural network that accurately decodes images into 11 distinct emotion categories. The work is a step forward in applying neural networks, computer systems modeled after the human brain, to the study of emotion.

Lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science, said, “A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system. We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”

The model, dubbed EmoNet, is retooled from an existing neural network called AlexNet, which enables computers to recognize objects. EmoNet’s goal is to provide a plausible account of how visual information is linked to distinct types of emotional responses.
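To make the idea of “retooling” concrete, below is a minimal sketch of this kind of adaptation in PyTorch: starting from a pretrained AlexNet and swapping its final object-recognition layer for an emotion classifier. The layer indices, the choice to freeze earlier layers, and the 11-way output are illustrative assumptions based on this article; they are not the authors’ actual training procedure.

```python
# Illustrative sketch only: adapting a pretrained object-recognition network
# (AlexNet) into an emotion classifier by replacing its final layer.
# This does not reproduce EmoNet's actual training details.
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTIONS = 11  # emotion categories reported in the article

# Start from AlexNet pretrained on object recognition (ImageNet).
net = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)

# Replace the final 1000-way object classifier with an emotion classifier.
in_features = net.classifier[6].in_features
net.classifier[6] = nn.Linear(in_features, NUM_EMOTIONS)

# Optionally freeze the earlier visual layers and train only the new head,
# so the network reuses generic visual features learned from objects.
for param in net.features.parameters():
    param.requires_grad = False

# A forward pass maps an image to scores over the emotion categories.
dummy_image = torch.randn(1, 3, 224, 224)  # one 224x224 RGB image
emotion_probs = net(dummy_image).softmax(dim=1)
print(emotion_probs.shape)  # torch.Size([1, 11])
```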

The model can accurately and consistently categorize 11 different types of emotions, though it was better at recognizing some than others. For example, it identified photographs that evoke craving or sexual desire with more than 95 percent accuracy. It had a harder time, however, with more nuanced emotions such as confusion, awe, and surprise.

Even a simple color elicited a prediction of emotion: When EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement. If there were two of them, it picked romance. EmoNet was also able to reliably rate the intensity of images, identifying not only the emotion an image might elicit but how strong that emotion might be.

In further experiments, the scientists showed EmoNet brief movie clips and asked it to classify them as romantic comedies, action films, or horror movies; it got it right three-quarters of the time.

To validate the model, the scientists recruited 18 human participants and used functional magnetic resonance imaging (fMRI) to measure their brain activity as they viewed brief, four-second flashes of 112 images. EmoNet saw the same pictures, effectively serving as a 19th subject. When the scientists compared activity in the neural network to activity in the subjects’ brains, the patterns matched up.
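The article does not specify how the model and brain patterns were compared, but one common way to quantify such a correspondence is representational similarity analysis: correlate the image-by-image similarity structure of the network’s responses with that of the fMRI responses. The sketch below illustrates that general idea with random placeholder data; the unit and voxel counts are hypothetical, and this is not the authors’ actual analysis pipeline.

```python
# Illustrative sketch only: comparing a network's response patterns to brain
# activity via representational similarity analysis (RSA), using fake data.
import numpy as np

n_images = 112   # images shown to participants (per the article)
n_units = 4096   # hypothetical number of network units
n_voxels = 2000  # hypothetical number of occipital-lobe voxels

rng = np.random.default_rng(0)
net_activations = rng.standard_normal((n_images, n_units))   # EmoNet responses
brain_activity = rng.standard_normal((n_images, n_voxels))   # fMRI responses

def similarity_matrix(responses):
    """Image-by-image correlation of response patterns."""
    return np.corrcoef(responses)

net_rsm = similarity_matrix(net_activations)
brain_rsm = similarity_matrix(brain_activity)

# Compare the two similarity structures using only off-diagonal entries.
iu = np.triu_indices(n_images, k=1)
match = np.corrcoef(net_rsm[iu], brain_rsm[iu])[0, 1]
print(f"model-brain pattern correspondence: {match:.3f}")
```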

Kragel said, “We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a biologically plausible way, even though we did not explicitly train it to do so.”

The brain imaging itself also yielded some surprising findings. Even a brief, basic image – an object or a face – could ignite emotion-related activity in the visual cortex of the brain. And different kinds of emotions lit up in different regions.

Senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder, said, “This shows that emotions are not just add-ons that happen later in different areas of the brain. Our brains are recognizing them, categorizing them, and responding to them very early on.”

The study was published Wednesday in the journal Science Advances.
