Mathematical model reveals how the human brain processes visual information

Evolution wired human brains to act like supercomputers.


Perception is often modeled as a process of active inference, in which noisy sensory measurements are combined with prior expectations to estimate the structure of the external environment. This mathematical framework has proved critical to understanding vision, cognition, motor control, and social interaction. Although theoretical work has shown how priors can be computed from environmental statistics, their neural instantiation could be realized by any of several competing encoding schemes.
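
To make the idea concrete, here is a minimal sketch (not the study’s model) of Bayesian cue combination, assuming for simplicity that both the prior and the noisy measurement are Gaussian: the posterior estimate is a precision-weighted average, pulled towards whichever source of information is more reliable.

```python
# Minimal sketch of Bayesian cue combination under a Gaussian assumption.
# The noisier the measurement, the more the estimate is pulled towards the prior.

def posterior_estimate(measurement, measurement_sd, prior_mean, prior_sd):
    """Posterior mean and variance for a Gaussian prior and Gaussian likelihood."""
    w_m = 1.0 / measurement_sd ** 2   # precision (reliability) of the measurement
    w_p = 1.0 / prior_sd ** 2         # precision (reliability) of the prior
    post_mean = (w_m * measurement + w_p * prior_mean) / (w_m + w_p)
    post_var = 1.0 / (w_m + w_p)
    return post_mean, post_var

# Example: a noisy reading of 10 (sd 5) combined with a prior centred on 0 (sd 10)
# yields an estimate of 8, pulled part of the way towards the prior expectation.
print(posterior_estimate(10.0, 5.0, 0.0, 10.0))   # (8.0, 20.0)
```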

Using a data-driven approach, scientists extracted the brain’s representation of visual orientation and compared this with simulations from different sensory coding schemes.
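
The study’s actual analysis pipeline is far more sophisticated, but as a loose, purely illustrative sketch of what competing “sensory coding schemes” can look like, the snippet below simulates two hypothetical populations of orientation-tuned units: one whose preferred orientations tile the space uniformly, and one whose preferences cluster around the cardinal (horizontal and vertical) orientations that dominate natural scenes. All unit counts, bandwidths, and function names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 64

# Scheme A: preferred orientations tiled uniformly across 0-180 degrees.
prefs_uniform = np.linspace(0, 180, n_units, endpoint=False)

# Scheme B: preferred orientations clustered near the cardinals (0 and 90 degrees).
prefs_cardinal = np.sort(np.concatenate([rng.normal(0, 15, n_units // 2),
                                         rng.normal(90, 15, n_units // 2)]) % 180)

def population_response(stimulus_deg, prefs_deg, bandwidth_deg=20.0):
    """Responses of orientation-tuned units (180-degree periodic tuning curves)."""
    # Wrapped orientation difference in degrees, in the range (-90, 90].
    delta = np.rad2deg(np.angle(np.exp(1j * np.deg2rad(2 * (stimulus_deg - prefs_deg))))) / 2
    return np.exp(-delta ** 2 / (2 * bandwidth_deg ** 2))

# The same stimulus evokes different population response patterns under the two
# schemes; a decoder applied to measured brain activity can, in principle, tell
# which pattern the data most resemble.
for label, prefs in [("uniform", prefs_uniform), ("cardinal-biased", prefs_cardinal)]:
    response = population_response(45.0, prefs)
    print(label, round(float(response.sum()), 2))
```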

In a study published in the journal Nature Communications, researchers from the Universities of Sydney, Queensland, and Cambridge developed a mathematical model that closely matches how the human brain processes visual information. The model contained everything needed to perform Bayesian inference.

A statistical technique called Bayesian inference combines new evidence with prior knowledge to generate educated guesses. For instance, if you know what a dog looks like and you see a fuzzy animal with four legs, you might conclude that it is a dog.
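
As a toy illustration of that example, the snippet below applies Bayes’ rule to made-up prior beliefs and likelihoods for the observation “a fuzzy animal with four legs”; the numbers are arbitrary and serve only to show how evidence updates prior knowledge.

```python
# Toy numbers for the dog example: prior beliefs about which animal it might be,
# and the assumed likelihood of observing "fuzzy, four legs" for each animal.
priors = {"dog": 0.30, "cat": 0.30, "bird": 0.40}
likelihoods = {"dog": 0.80, "cat": 0.70, "bird": 0.01}   # P(observation | animal)

# Bayes' rule: the posterior is proportional to prior times likelihood.
unnormalised = {animal: priors[animal] * likelihoods[animal] for animal in priors}
evidence = sum(unnormalised.values())                     # P(observation)
posterior = {animal: value / evidence for animal, value in unnormalised.items()}

print(posterior)  # "bird" is now very unlikely; "dog" becomes the best guess
```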

People naturally perceive their surroundings with remarkable accuracy and speed, unlike robots, which can be defeated by straightforward CAPTCHA security checks asking them to locate fire hydrants in a panel of images.

The study’s senior investigator, Dr Reuben Rideaux, from the University of Sydney’s School of Psychology, said: “Despite the conceptual appeal and explanatory power of the Bayesian approach, how the brain calculates probabilities is largely mysterious.”

“Our new study sheds light on this mystery. We discovered that the basic structure and connections within our brain’s visual system are set up to allow it to perform Bayesian inference on the sensory data it receives.”

“What makes this finding significant is the confirmation that our brains have an inherent design that allows this advanced form of processing, enabling us to interpret our surroundings more effectively.”

The study’s results not only support existing hypotheses that the brain employs Bayesian-like reasoning, but also pave the way for future research and innovation in which the brain’s inherent capacity for Bayesian inference can be exploited for useful applications that benefit society.

Dr Rideaux said: “Our research, while primarily focussed on visual perception, holds broader implications across the spectrum of neuroscience and psychology. By understanding the fundamental mechanisms that the brain uses to process and interpret sensory data, we can pave the way for advancements in fields ranging from artificial intelligence, where mimicking such brain functions can revolutionize machine learning, to clinical neurology, potentially offering new strategies for therapeutic interventions in the future.”

Journal Reference:

  1. Harrison, W.J., Bays, P.M. & Rideaux, R. Neural tuning instantiates prior expectations in the human visual system. Nat Commun 14, 5320 (2023). DOI: 10.1038/s41467-023-41027-w
