Butterfly-inspired AI technology for multi-sensory decision making

The platform is both more advanced and more energy-efficient than other AI technologies.


When it comes to mating, Heliconius butterflies weigh both the look and the smell of a potential partner. Despite their small brains, they process both sensory inputs simultaneously, and they are better at multi-sensory decision-making than current AI technologies, which consume significant energy to achieve similar results.

To close this gap, a team of researchers from Penn State has developed a more advanced and energy-efficient multi-sensory AI platform. This breakthrough could be a game-changer for robotics and for smart sensors that detect potential dangers such as failing structures or chemical leaks.

“If you think about the AI we have today, we have very good image processors based on vision, or excellent language processors that use audio,” said Saptarshi Das, associate professor of engineering science and mechanics and corresponding author of the study. “But when you think about most animals and also human beings, decision-making is based on more than one sense. While AI performs quite well with a single sensory input, multi-sensory decision-making is not happening with the current AI.”

Heliconius butterflies have a unique way of choosing a mate. They use a combination of visual cues and chemical cues, the pheromones released by the other butterfly, to determine whether the potential mate really is a Heliconius butterfly. What’s even more impressive is that they do this with a tiny brain that uses minimal energy.

These tiny creatures perform complex computational tasks on multiple sensory inputs simultaneously, which is quite different from modern computing, where the same work consumes a significant amount of energy.

Scientists have found a way to mimic this butterfly behavior electronically, using 2D materials. The hardware platform the researchers developed combines two such materials: molybdenum disulfide (MoS2) and graphene. MoS2 forms a memtransistor, a device that performs both memory and information processing, and was chosen for its light-sensing capabilities, which mimic the butterfly’s vision. Graphene, in turn, serves as a chemitransistor that detects chemical molecules, replicating the butterfly’s pheromone detection.
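To make the division of labor concrete, here is a minimal Python sketch of how two such channels might be modeled and fused. The response functions, parameter values, and additive fusion are illustrative assumptions made for this article, not the device physics reported in the study.

```python
import numpy as np

def mos2_photoresponse(light_intensity: float, responsivity: float = 0.8) -> float:
    """Hypothetical photocurrent from the MoS2 memtransistor channel.
    A simple saturating curve stands in for the real (nonlinear,
    history-dependent) device behavior."""
    return responsivity * (1.0 - np.exp(-light_intensity))

def graphene_chemiresponse(concentration: float, sensitivity: float = 0.6) -> float:
    """Hypothetical current shift from the graphene chemitransistor.
    A logarithmic response is a common first-order model for
    adsorption-driven chemical sensors; the real device may differ."""
    return sensitivity * np.log1p(concentration)

def visuochemical_output(light_intensity: float, concentration: float) -> float:
    """Fuse both channels into one output, mimicking in-sensor integration
    of visual and chemical cues. Additive fusion is an assumption made
    purely for illustration."""
    return mos2_photoresponse(light_intensity) + graphene_chemiresponse(concentration)

# Sweep a few stimulus pairs, loosely echoing the colored-light and
# chemical-solution exposures described below.
for light, chem in [(0.5, 0.1), (2.0, 0.1), (2.0, 1.0)]:
    print(f"light={light}, chem={chem} -> output={visuochemical_output(light, chem):.3f}")
```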

“The visual cue and the pheromone chemical cue drive the decision whether that female butterfly will mate with the male butterfly or not,” said co-author Subir Ghosh, a second-year doctoral student in engineering science and mechanics. “So, we got an idea inspired by that, thinking how we have 2D materials with those capabilities. The photoresponsive MoS2 and the chemically active graphene could be combined to create a visuochemical-integrated platform for AI and neuromorphic computing.”

In their experiment, the researchers exposed the dual-material sensor to different colored lights and to solutions of varying chemical composition, analogous to the pheromones butterflies release. They wanted to test how well the sensor could integrate information from the photodetector and the chemisensor, much as a butterfly’s mating success relies on matching wing color and pheromone strength.

Based on the output response, the researchers concluded that their devices could seamlessly integrate visual and chemical cues, suggesting the sensor has strong potential to process and interpret diverse types of information simultaneously.

“We also introduced adaptability in our sensor’s circuits, such that one cue could play a more significant role than the other,” said Yikai Zheng, a fourth-year doctoral student in engineering science and mechanics and co-author of the study. “This adaptability is akin to how a female butterfly adjusts her mating behavior in response to varying scenarios in the wild.”
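As a rough software analogue of that adaptability, one can give each cue a tunable weight so that either channel dominates the fused decision. Everything in this sketch, the response models, the weights, and the threshold, is a hypothetical stand-in rather than the researchers’ actual circuit.

```python
import numpy as np

def weighted_mating_decision(light_intensity, concentration,
                             w_visual=0.7, w_chem=0.3, threshold=0.5):
    """Toy weighted fusion of a visual and a chemical cue. Raising one
    weight lets that cue dominate, loosely analogous to the circuit-level
    adaptability described above. All parameters are illustrative."""
    visual = 1.0 - np.exp(-light_intensity)  # toy photoresponse
    chemical = np.log1p(concentration)       # toy chemiresponse
    score = w_visual * visual + w_chem * chemical
    return score >= threshold                # stand-in for a mate / no-mate output

# Identical stimuli, different weightings, different outcomes.
print(weighted_mating_decision(2.0, 0.05, w_visual=0.9, w_chem=0.1))  # True
print(weighted_mating_decision(2.0, 0.05, w_visual=0.1, w_chem=0.9))  # False
```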

According to the researchers, dual sensing in a single device is more energy-efficient than the way current AI systems operate. Instead of collecting data from separate sensor modules and shuttling it to a processing module, the new approach integrates two senses in a single device, which both reduces delays and avoids excessive energy consumption.
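The efficiency claim is architectural: fusing at the sensor means less raw data crossing a chip boundary. The toy tally below makes that contrast concrete; the transfer counts are invented for illustration and say nothing about the actual device’s numbers.

```python
def transfers_separate_modules(n_samples: int) -> int:
    """Conventional arrangement (illustrative): each sensor module ships
    its raw samples to a separate processing module before fusion."""
    visual_raw = n_samples
    chemical_raw = n_samples
    return visual_raw + chemical_raw

def transfers_in_sensor_fusion(n_samples: int) -> int:
    """Integrated arrangement (illustrative): cues are fused at the device,
    so only the combined result leaves the sensor."""
    fused = n_samples
    return fused

n = 1_000
print(transfers_separate_modules(n), "raw off-sensor transfers vs",
      transfers_in_sensor_fusion(n), "fused transfers")
```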

Next, the researchers plan to expand the device to integrate three senses, similar to how crayfish use visual, tactile, and chemical cues to sense prey and predators. The ultimate goal is hardware AI that can handle complex decision-making scenarios in diverse environments.

“We could have sensor systems in places such as a power plant that would detect potential issues such as leaks or failing systems based on multiple sensory cues,” Ghosh said. “Such as a chemical odor, a change in vibration, or detecting weaknesses visually. This would then better help the system and staff determine what they need to do to fix it quickly because it was not just relying on one sense, but multiple ones.”

Journal reference:

  1. Yikai Zheng, Subir Ghosh, Saptarshi Das. A Butterfly-Inspired Multisensory Neuromorphic Platform for Integration of Visual and Chemical Cues. Advanced Materials, 2023; DOI: 10.1002/adma.202307380
