April 15, 2024

2D materials drive biorealism in computing

THE PAPER IN BRIEF

• The brain offers ample inspiration for computer engineers, but these “neuromorphic” devices can be hampered by enormous power consumption, limited endurance, and considerable variability.

• One of the challenges associated with optimizing these devices involves determining which brain characteristics to emulate.

• In an article in Nature, Yan et al.1 report a type of synaptic transistor, a device named for its similarities to the neuronal connections known as synapses, that maximizes performance through a ratchet mechanism reminiscent of the way neurons strengthen their synapses.

• The transistor could enable energy-efficient artificial intelligence algorithms and reproduce some of the brain’s many sophisticated behaviors.

FRANK H. L. KOPPENS: A new twist on synaptic transistors

At the heart of Yan and his colleagues’ innovation is the unusual behavior of electrons that arises when single-atom-thick materials are stacked and then twisted relative to each other. The materials in question are bilayer graphene, which comprises two stacked layers of carbon atoms, sandwiched between two layers of the dielectric material hexagonal boron nitride (hBN)2. Both materials have hexagonal crystal structures, but the spacing between their atoms differs slightly. The overlapping hexagonal patterns create regions of constructive and destructive interference, resulting in a larger-scale pattern known as a moiré lattice.

The moiré pattern modifies the way electrons are distributed in bilayer graphene: it localizes them periodically throughout the crystal lattice (Fig. 1a). The electrons in the top layer of graphene are more affected by this periodic modulation because the crystal structure of this layer is aligned with that of the hBN on top, and this essentially immobilizes the electrons. In contrast, the hBN beneath the bottom graphene layer is misaligned with the graphene, resulting in weaker electronic modulation3. The electrons in this layer therefore remain mobile and contribute to the flow of current.

Figure 1 | A transistor that imitates a biological synapse. Yan et al.1 constructed a device comprising two layers of graphene (each a single sheet of carbon atoms) and the dielectric material hexagonal boron nitride (hBN). a, The device is called a moiré synaptic transistor because it shares similarities with the synaptic connections between neurons and because a “moiré” pattern forms between the overlapping hexagonal crystal structures of the top layer of graphene and the hBN. This pattern localizes the electrons in the top layer of graphene, but those in the bottom layer remain mobile. Applying a voltage pulse to the top gate (a component that regulates the number of electrons in the graphene system; not shown) results in a ratchet effect, whereby the electrical current is increased with successive pulses. b, This effect is reminiscent of the way that repeated electrical stimulation can strengthen synapses by enriching protein complexes called AMPA receptors, improving the effectiveness of neurotransmitters and increasing ion flow.

This interlayer asymmetry causes the transistor to function as a kind of ratchet, controlling the flow of mobile electrons and regulating the electrical conductance of the device, which is analogous to synaptic strength. The ratchet is controlled by two “gates” above and below the structure, which regulate the number of electrons in the graphene system. When a voltage pulse is applied to the top gate, the initial voltage increase adds immobile electrons to the top layer of graphene. Once the energy levels of the electrons in this layer are filled, mobile electrons are added to the bottom layer of graphene. A subsequent decrease in voltage removes electrons from the top layer of graphene, but the mobile electrons in the bottom layer remain. In this way, each voltage pulse modifies the conductance in a way reminiscent of the strengthening of a synaptic connection.
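The pulse-by-pulse ratchet described above can be captured in a toy model. The sketch below is purely illustrative; the trap capacity, pulse charge, and linear conductance are arbitrary placeholder values, not parameters from the paper:

```python
# Toy model of the ratchet effect: each top-gate pulse first fills
# immobile moiré-localized states in the top graphene layer; any
# overflow lands in the bottom layer as mobile carriers, which
# persist after the pulse ends.

def apply_pulse(state, pulse_charge, trap_capacity=1.0):
    """Return the (trapped, mobile) carrier densities after one pulse.

    During the pulse, carriers fill the top-layer localized states up
    to trap_capacity; the excess becomes mobile in the bottom layer.
    When the voltage returns to zero, the top-layer carriers drain
    away but the mobile carriers remain (one ratchet step).
    """
    trapped, mobile = state
    room = max(trap_capacity - trapped, 0.0)
    into_traps = min(pulse_charge, room)
    mobile += pulse_charge - into_traps   # overflow -> persistent mobile charge
    return (0.0, mobile)                  # traps empty once the pulse ends

def conductance(state, g0=1.0):
    # Conductance grows with the persistent mobile carrier density,
    # playing the role of synaptic strength.
    return g0 * state[1]

state = (0.0, 0.0)
history = []
for _ in range(5):                        # five identical voltage pulses
    state = apply_pulse(state, pulse_charge=1.5)
    history.append(conductance(state))
# history steps up monotonically: the device "remembers" each pulse
```

Each pulse leaves the conductance higher than before, mimicking the cumulative strengthening of a repeatedly stimulated synapse.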

What distinguishes Yan and his co-workers’ moiré device from existing synaptic transistors is that it can be tuned easily, a feature that mirrors the synaptic behavior observed in biological neural networks. This makes the transistor ideal for advanced artificial intelligence (AI) applications, particularly those involving ‘computing in memory’ designs that integrate processing circuits directly into the memory itself to maximize power efficiency. It could also allow information to be processed on devices located at the edge of a network, rather than in a centralized data center, thus improving data security and privacy.

Although the authors’ transistor represents an important advance, it is not without limitations. For example, stacking ultrathin materials requires sophisticated manufacturing processes, making it difficult to scale up the technology for widespread industrial use. On a positive note, methods already exist to grow large-area bilayer graphene4 and hBN5, up to the typical wafer sizes of 200 or 300 millimetres used in the silicon industry. This sets the stage for an ambitious but timely effort: fully automated robotic assembly of large-area moiré materials.

If achieved, this would make Yan and his colleagues’ device easier to manufacture and would unlock other innovations in moiré materials, such as quantum sensors, non-volatile computer memories, and energy storage devices. It would also bring us closer to integrating moiré synaptic transistors into larger, more complex neural networks, a crucial step toward realizing the full potential of these devices in real-world applications.

JAMES B. AIMONE AND FRANCES S. CHANCE: Capturing brain functionality

Yan and his collaborators’ breakthrough addresses a long-standing challenge at the intersection of neuroscience and computing: identifying which biophysical features of the immensely complicated brain are necessary to achieve functional neuromorphic computing and which can be ignored. The authors have managed to emulate a characteristic of the brain that is particularly difficult to realize: its synaptic plasticity, which describes the ability of neurons to control the strength of their synaptic connections.

Existing synaptic transistors can be connected together in grid-like architectures that mimic neural networks. But dynamic reprogramming of most of these devices remains unreliable or expensive, while the brain’s synapses can adapt reliably and robustly over time. Furthermore, even if biological mechanisms of synaptic plasticity could be implemented in an artificial system, it is still unclear how to harness these mechanisms to realize algorithms that can learn as biological systems do.

The authors’ moiré synaptic transistor brings the flexibility and control necessary for brain-like synaptic learning by providing a powerful way to tune its electrical conductance, an indicator of synaptic strength. The device’s asymmetric charge-transfer mechanism is reminiscent of processes known as long-term potentiation and long-term depression, in which pulsed electrical stimulation has the effect of strengthening a synapse (or weakening it, in the case of depression). The increase in charge carriers can be considered analogous to the enrichment of protein complexes, known as AMPA receptors, at synapses during long-term potentiation6 (Fig. 1b).

Inspired by the behaviors observed in biological synapses, Yan et al. demonstrated that their device could be used to train neuromorphic circuits in a more “brain-like” manner than had previously been achieved with artificial-synapse devices. Although the two gates of the moiré synaptic transistor could simply be used to adjust synaptic strength (electrical conductance) directly, in biology the control of synaptic learning is more nuanced. The authors showed that some aspects of this more precise control could also be realized in their device.

Specifically, Yan et al. were able to adjust the top- and bottom-gate voltages to make their moiré synaptic transistor exhibit input-specific adaptation, a phenomenon that allows a neuron to control its synaptic learning rates in response to its average input. This mechanism is engaged when the eye is deprived (of adequate lighting, for example), and it helps the brain to recall a stored pattern when presented with a similar one.

The authors’ moiré synaptic transistor could emulate this mechanism when programmed with a learning rule known as the Bienenstock-Cooper-Munro (BCM) model7, which sets a dynamically updated threshold for strengthening or weakening a synapse that depends on the history of the neuron’s activity. The BCM rule is an abstract algorithmic description of synaptic plasticity in the brain that has been linked to cognitive behaviors. By proving that their device can implement this rule, Yan et al. have offered a path to recreating biorealistic plasticity in human-made hardware.
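The sliding threshold at the heart of the BCM rule can be written in a few lines. The sketch below is a generic textbook form of the rule; the learning rate, time constant, and input pattern are arbitrary choices for illustration, not parameters from the paper or the device:

```python
import numpy as np

def bcm_step(w, x, theta, lr=0.01, tau_theta=10.0, dt=1.0):
    """One discrete-time update of the Bienenstock-Cooper-Munro rule.

    w: synaptic weights; x: presynaptic input vector;
    theta: sliding modification threshold.
    """
    y = float(np.dot(w, x))               # postsynaptic activity
    # Potentiation when y > theta, depression when y < theta
    w = w + lr * y * (y - theta) * x * dt
    # The threshold slides toward the recent average of y^2, so a
    # persistently active neuron raises its own bar for further
    # strengthening -- the input-specific adaptation described above
    theta = theta + (dt / tau_theta) * (y**2 - theta)
    return w, theta

# Usage: repeated stimulation with the same input pattern drives the
# threshold up, stabilizing learning instead of letting weights run away.
w = np.full(4, 0.5)
theta = 0.1
x = np.array([1.0, 0.0, 1.0, 0.0])       # only two synapses receive input
for _ in range(200):
    w, theta = bcm_step(w, x, theta)
```

Note that only the stimulated synapses change; the silent ones (where `x` is zero) keep their initial weights, which is the input specificity the rule is named for.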

Their work provides an opportunity for the BCM learning rule to act as a Rosetta Stone between theoretical neuroscience (much of which is based on BCM and similar models) and next-generation neuromorphic computing. For example, the authors’ ingenious double-gate control could be used to achieve synaptic plasticity in the vestibulo-ocular reflex, the mechanism that stabilizes images on the retina when the head moves8. It will be interesting to see what other models of plasticity can be expressed, such as spike-timing-dependent plasticity, in which the strengthening of a synapse depends on the timing of stimulation9.
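For reference, the pair-based form of spike-timing-dependent plasticity mentioned above is also compact enough to state directly. The amplitudes and time constant below are illustrative values, not measurements from any device:

```python
import math

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based spike-timing-dependent plasticity kernel.

    dt_ms is the spike-timing difference t_post - t_pre in milliseconds.
    A presynaptic spike shortly before a postsynaptic one (dt > 0)
    strengthens the synapse; the reverse order (dt < 0) weakens it,
    with both effects decaying exponentially as the gap widens.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau)
    return -a_minus * math.exp(dt_ms / tau)
```

A hardware synapse that could realize this kernel alongside BCM-style adaptation would cover two of the most widely used plasticity models in theoretical neuroscience.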
