A new discovery by Yale University could advance understanding of the human visual system and improve a host of artificial intelligence efforts. Using precise brain measurements, scientists predicted how people's eyes move while viewing natural scenes.
Eye movements have been studied extensively. Scientists can tell with some certainty where a gaze will be directed at different elements in the environment. But it remains unclear how the brain orchestrates this ability, which is so fundamental to survival.
In a previous 'mind reading' study, scientists reconstructed the facial images people viewed while being scanned in an MRI machine, based on their brain imaging data alone.
In the new study, scientists took a similar approach and showed that by analyzing brain responses to complex, natural scenes, they could predict where people would direct their attention and gaze. To do this, they analyzed the brain data with deep convolutional neural networks — models that are used extensively in artificial intelligence (AI).
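The idea of predicting where a viewer will look can be illustrated with a much simpler stand-in than the study's deep networks. The sketch below is purely hypothetical and not the authors' method: it computes a crude bottom-up saliency map from local contrast (gradient energy) and picks the most salient location as the predicted fixation point. Function names and the toy scene are invented for illustration.

```python
import numpy as np

def saliency_map(image):
    """Crude bottom-up saliency: smoothed local-contrast (gradient) energy.
    A toy stand-in for the deep-CNN read-out described in the study."""
    gy, gx = np.gradient(image.astype(float))
    energy = np.sqrt(gx ** 2 + gy ** 2)
    # Average over a small neighborhood so isolated pixels don't dominate.
    k = 3
    pad = np.pad(energy, k, mode="edge")
    h, w = energy.shape
    smoothed = np.zeros_like(energy)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            smoothed += pad[k + dy:k + dy + h, k + dx:k + dx + w]
    return smoothed / (2 * k + 1) ** 2

def predict_fixation(image):
    """Return the (row, col) of the most salient location,
    i.e. the predicted gaze target."""
    s = saliency_map(image)
    return np.unravel_index(np.argmax(s), s.shape)

# Toy scene: a flat background with one high-contrast patch.
scene = np.zeros((64, 64))
scene[40:48, 10:18] = 1.0
print(predict_fixation(scene))  # lands on or near the patch
```

A real model in this vein would replace the gradient-energy map with features from a pretrained convolutional network, which capture far richer structure than raw contrast.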
Marvin Chun, Richard M. Colgate Professor of Psychology and professor of neuroscience, said, “The work represents a perfect marriage of neuroscience and data science. It has a myriad of potential applications — such as testing competing artificial intelligence systems that categorize images and guide driverless cars.”
“People can see better than AI systems can. Understanding how the brain performs its complex calculations is an ultimate goal of neuroscience and benefits AI efforts.”
The discovery is reported in the journal Nature Communications.