A new synaptic transistor mimics human intelligence

Transistor performs energy-efficient associative learning at room temperature.

Moiré quantum materials, formed by stacking and twisting two-dimensional layers, exhibit unique electronic behaviors due to enhanced internal interactions. Combined with the robust electrostatic control possible in atomically thin materials, these structures could pave the way for highly functional electronic devices. However, the challenge has been that these moiré electronic phenomena are typically observed only at extremely low temperatures, making them impractical for real-world applications.

Researchers from Northwestern University, Boston College, and MIT have developed a new synaptic transistor capable of higher-level thinking.

This device mimics the human brain by processing and storing information simultaneously. In recent experiments, the transistor showcased capabilities beyond basic machine-learning tasks, successfully categorizing data and demonstrating associative learning.

This device’s stability at room temperature sets it apart from previous brain-like computing devices that required extremely low temperatures. Additionally, it operates at high speeds, consumes minimal energy, and retains stored information even when not powered, making it well-suited for practical, real-world applications.

Northwestern’s Mark C. Hersam co-led the research and said, “The brain has a fundamentally different architecture than a digital computer. In a digital computer, data move back and forth between a microprocessor and memory, which consumes a lot of energy and creates a bottleneck when attempting multiple tasks simultaneously.”

“On the other hand, memory and information processing are co-located and fully integrated in the brain, resulting in higher energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to mimic the brain more faithfully.”

The progress in artificial intelligence (AI) has spurred researchers to create computers that function more similarly to the human brain. Traditional digital computing systems have separate units for processing and storage, leading to high energy consumption for data-intensive tasks. As smart devices accumulate large amounts of data, researchers are exploring innovative approaches to process this data efficiently without a substantial increase in power consumption.

Currently, the most advanced technology for combining processing and memory functions is the memory resistor, or “memristor.” However, memristors still face challenges related to energy-intensive switching processes.

Hersam said, “For several decades, the paradigm in electronics has been to build everything out of transistors and use the same silicon architecture. Significant progress has been made by simply packing more and more transistors into integrated circuits.”

“You cannot deny the success of that strategy, but it comes at the cost of high power consumption, especially in the current era of big data where digital computing is on track to overwhelm the grid. We must rethink computing hardware, especially for AI and machine-learning tasks.”

Researchers have delved into the physics of moiré patterns, which emerge when two periodic patterns are overlaid with a slight offset. By stacking and twisting two-dimensional materials such as bilayer graphene and hexagonal boron nitride, the team formed a moiré pattern that unlocks unique electronic properties. By adjusting the twist between the layers, the researchers achieved different electronic behaviors in each graphene layer, harnessing moiré physics to demonstrate neuromorphic functionality at room temperature.
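To give a sense of why a small twist matters so much: for two identical lattices with lattice constant $a$ twisted by a small angle $\theta$, the resulting moiré superlattice has a period that grows rapidly as the twist shrinks (a standard result for twisted bilayers; the specific values here are illustrative, not taken from this study):

$$\lambda_{\text{moir\'e}} \approx \frac{a}{2\sin(\theta/2)}$$

For graphene ($a \approx 0.246$ nm), a twist of about 1° already produces a superlattice period of roughly 14 nm, some 57 times larger than the atomic spacing, which is why twist acts as such a powerful design parameter.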

Hersam said, “With twist as a new design parameter, the number of permutations is vast. Graphene and hexagonal boron nitride are structurally similar but different enough for powerful moiré effects.”

Researchers trained the transistor to recognize similar — but not identical — patterns.

The researchers trained the device on a specific pattern, say 000 (three zeros in a row), and then asked it to identify similar patterns, such as 111 or 101. Having learned the initial pattern, the device recognized that 111 is more similar to 000 than 101 is, since both 000 and 111 consist of three identical digits in a row. This ability to identify similarities between non-identical patterns is a higher-level cognitive function known as associative learning.
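The similarity judgment described above can be illustrated with a toy sketch. This is not the device's physics or the authors' algorithm, just a minimal software analogy of the idea that 000 and 111 share internal structure (three identical digits) that 101 lacks; the function names are hypothetical:

```python
def uniformity(pattern: str) -> float:
    # Fraction of adjacent digit pairs that match: "000" -> 1.0, "101" -> 0.0
    pairs = list(zip(pattern, pattern[1:]))
    return sum(a == b for a, b in pairs) / len(pairs)

def similarity(trained: str, probe: str) -> float:
    # Toy score: patterns with the same internal structure score higher,
    # even when no individual digits match (as with 000 vs. 111)
    return 1.0 - abs(uniformity(trained) - uniformity(probe))

# After "training" on 000, 111 scores as more similar than 101:
assert similarity("000", "111") > similarity("000", "101")
```

Note that a digit-by-digit comparison (Hamming distance) would give the opposite answer, which is exactly what makes this structural judgment a higher-level association rather than simple pattern matching.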

In experiments, the newly developed synaptic transistor demonstrated its ability to recognize similar patterns, showcasing its associative memory. Even with incomplete patterns, it still successfully showed associative learning, indicating its adaptability and robust performance.

Hersam said, “Current AI can be easy to confuse, which can cause major problems in certain contexts. Imagine if you are using a self-driving vehicle, and the weather conditions deteriorate. The vehicle might not be able to interpret the more complicated sensor data as well as a human driver could. But even when we gave our transistor imperfect input, it could still identify the correct response.”

Journal Reference:

  1. Yan, X., Zheng, Z., Sangwan, V.K. et al. Moiré synaptic transistor with room-temperature neuromorphic functionality. Nature 624, 551–556 (2023). DOI: 10.1038/s41586-023-06791-1
