Researchers have developed a brain-like computing device capable of learning by association.
Similar to how the famous physiologist Ivan Pavlov conditioned dogs to associate a bell with food, researchers at Northwestern University and the University of Hong Kong managed to condition their circuit to associate light with pressure.
The research will be published on April 30 in the journal Nature Communications.
The secret of the device lies in its novel organic electrochemical “synaptic transistors,” which simultaneously process and store information, just like the human brain. The researchers showed that the transistor can mimic the short- and long-term plasticity of synapses in the human brain, building on memories to learn over time.
With its brain-like capabilities, the new transistor and circuit could potentially overcome the limitations of traditional computing, including its energy-wasting hardware and limited ability to multitask. The brain-like device also has higher fault tolerance, continuing to run smoothly even when some components fail.
“While the modern computer is exceptional, the human brain can easily outperform it in some complex and unstructured tasks, such as pattern recognition, motor control, and multisensory integration,” said Jonathan Rivnay of Northwestern, the study’s senior author. “This is thanks to the plasticity of the synapse, which is the cornerstone of the brain’s computing power. These synapses allow the brain to function in a highly parallel, fault-tolerant, and energy-efficient manner. In our work, we demonstrate an organic, plastic transistor that mimics the key functions of a biological synapse.”
Rivnay is an assistant professor of biomedical engineering at the McCormick School of Engineering at Northwestern. He co-led the study with Paddy Chan, associate professor of mechanical engineering at the University of Hong Kong. Xudong Ji, postdoctoral researcher in Rivnay’s group, is the first author of the article.
Problems with conventional computing
Conventional digital computer systems have separate processing and storage units, so data-intensive tasks consume large amounts of power. Inspired by the combined process of computation and storage in the human brain, researchers in recent years have sought to develop computers that function more like the human brain, with sets of devices that function as a network of neurons.
“The way our current computer systems work is that memory and logic are physically separate,” Ji said. “You do computations and send that information to a memory unit. Then whenever you want to retrieve that information, you have to recall it. If we can bring these two separate functions together, we can save space and cut energy costs.”
Currently, the memory resistor, or “memristor,” is the most mature technology that can combine processing and memory functions, but memristors suffer from energy-intensive switching and lower biocompatibility. These drawbacks led researchers to the synaptic transistor, specifically the organic electrochemical synaptic transistor, which operates at low voltages, offers continuously tunable memory, and is highly compatible with biological applications. However, challenges remain.
“Even high-performance organic electrochemical synaptic transistors require the write operation to be decoupled from the read operation,” Rivnay said. “So if you want to retain a memory, you have to decouple it from the write process, which can further complicate integration into circuits or systems.”
How the synaptic transistor works
To overcome these challenges, the Northwestern and University of Hong Kong team optimized a conductive plastic material in the organic electrochemical transistor that can trap ions. In the brain, a synapse is a structure through which one neuron can transmit signals to another neuron, using small molecules called neurotransmitters. In the synaptic transistor, ions behave similarly to neurotransmitters, sending signals between terminals to form an artificial synapse. By keeping the stored data of the trapped ions, the transistor remembers previous activities, developing long-term plasticity.
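The short- and long-term plasticity described above can be pictured with a toy model: each voltage pulse transiently raises the channel conductance (short-term change), and a fraction of the injected ions become trapped, leaving a persistent conductance shift (long-term change). This is a minimal sketch with made-up parameters, not measured device values from the paper.

```python
# Toy model of short- and long-term plasticity in a synaptic transistor.
# All numbers (gain, retention, decay) are illustrative assumptions.

def apply_pulses(n_pulses, w_short=0.0, w_long=0.0,
                 gain=1.0, retention=0.3, decay=0.5):
    """Each pulse injects ions (transient boost); a fraction is
    'trapped' and consolidated into a lasting weight change."""
    for _ in range(n_pulses):
        w_short += gain                 # ions drift in: transient conductance rise
        w_long += retention * w_short   # trapped ions: persistent change
        w_short *= (1 - decay)          # untrapped ions relax back out
    return w_short, w_long

short, long_term = apply_pulses(5)
print(short, long_term)  # the transient part saturates; the trapped part accumulates
```

The key qualitative behavior, matching the article's description, is that repeated stimulation leaves a memory (the long-term weight keeps growing) even though the transient response levels off.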
The researchers demonstrated the synaptic behavior of their device by connecting individual synaptic transistors into a neuromorphic circuit to simulate associative learning. They integrated pressure and light sensors into the circuit and trained the circuit to associate the two unrelated physical inputs (pressure and light) with each other.
Perhaps the most famous example of associative learning is Pavlov’s dog, which naturally drooled when it encountered food. After conditioning the dog to associate a ringing bell with food, the dog also began to drool upon hearing the sound of a bell. For the neuromorphic circuit, the researchers activated a voltage by pressing with a finger. To condition the circuit to associate light with pressure, they first applied pulsed light from an LED and then immediately applied pressure. In this scenario, the pressure is the food and the light is the bell. The device’s corresponding sensors detected both inputs.
After one training cycle, the circuit made an initial connection between light and pressure. After five training cycles, the circuit strongly associated light with pressure. Light alone was capable of triggering a signal, or “conditioned response.”
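The conditioning procedure above follows a simple Hebbian pattern: pressure drives the output directly (the unconditioned stimulus), and each light-plus-pressure pairing strengthens the light pathway until light alone crosses the response threshold. The sketch below is an illustrative software analogy; the learning rate and threshold are invented, not parameters reported in the study.

```python
# Minimal sketch of Pavlovian conditioning in the neuromorphic circuit.
# THRESHOLD and the learning rate are hypothetical values for illustration.

THRESHOLD = 0.5

def respond(light, pressure, w_light):
    """Pressure (unconditioned stimulus) drives the output directly;
    light contributes only through the learned synaptic weight."""
    return pressure + w_light * light >= THRESHOLD

def train(cycles, w_light=0.0, lr=0.15):
    """Each light+pressure pairing potentiates the light pathway,
    analogous to long-term plasticity in the synaptic transistor."""
    for _ in range(cycles):
        w_light += lr  # co-activation strengthens the association
    return w_light

w = train(1)
print(respond(light=1, pressure=0, w_light=w))  # False: one pairing is too weak
w = train(5)
print(respond(light=1, pressure=0, w_light=w))  # True: light alone now triggers a response
```

As in the experiment, one cycle forms only a weak connection, while five cycles are enough for the conditioned stimulus (light) to trigger the response on its own.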
Because the synaptic circuit is made of soft polymers, like plastic, it can be easily fabricated on flexible sheets and readily integrated into soft, wearable electronics, smart robotics, and implantable devices that interface directly with living tissue and even the brain.
“Although our application is a proof of concept, our proposed circuit can be further expanded to include more sensory inputs and integrated with other electronic components to enable low-power, on-site computing,” Rivnay said. “Because it is compatible with biological environments, the device can interface directly with living tissue, which is essential for next-generation bioelectronics.”