2023
DOI: 10.1039/d3nr02780e
A bi-functional three-terminal memristor applicable as an artificial synapse and neuron

Lingli Liu,
Putu Andhita Dananjaya,
Calvin Ching Ian Ang
et al.

Abstract: In this work, a gate-controlled memristor that enables synaptic and neuronal bi-functionality is proposed, which efficiently enhances neural network hardware implementation; the device is fabricated entirely with standard CMOS techniques.

Cited by 4 publications (3 citation statements)
References 60 publications (84 reference statements)
“…11,12 Concurrently, non-filamentary memristors have also been explored in neuromorphic research. 13–15 The analog nature of such memristive devices, due to their gradual resistive switching characteristics, aligns well with the intricacies of the human brain, allowing for the creation of artificial neural networks that closely mimic the behavior of biological synapses. The synaptic weights, also known as the connection strength between the neurons, are represented by the conductance of each device, and the analog properties enable different weights to be stored via a step-by-step weight adjustment during the neural network training process.…”
Section: Introduction (mentioning)
confidence: 89%
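The gradual, analog conductance tuning described above can be sketched in code. This is an illustrative model only, not an implementation from the paper: the device class, conductance range, and step count are all hypothetical, chosen to show how a synaptic weight stored as conductance is adjusted step by step during training.

```python
# Illustrative sketch (not from the paper): an analog memristive synapse
# whose weight is stored as device conductance and adjusted in gradual
# steps, as non-filamentary memristors allow. All parameter values are
# hypothetical.

class AnalogSynapse:
    """Synaptic weight stored as conductance, tuned one step at a time."""

    def __init__(self, g_min=1e-6, g_max=1e-4, n_steps=100):
        self.g_min = g_min          # lowest conductance state (S)
        self.g_max = g_max          # highest conductance state (S)
        self.step = (g_max - g_min) / n_steps
        self.g = g_min              # start in the low-conductance state

    def potentiate(self):
        """One SET pulse: raise conductance by one analog step."""
        self.g = min(self.g + self.step, self.g_max)

    def depress(self):
        """One RESET pulse: lower conductance by one analog step."""
        self.g = max(self.g - self.step, self.g_min)

    @property
    def weight(self):
        """Normalized synaptic weight in [0, 1]."""
        return (self.g - self.g_min) / (self.g_max - self.g_min)


syn = AnalogSynapse()
for _ in range(25):
    syn.potentiate()            # 25 consecutive SET pulses
print(round(syn.weight, 2))     # → 0.25
```

Because each pulse moves the conductance by a small, bounded step, the stored weight can take many intermediate values between the two extreme states, which is what makes such devices suitable for in-situ neural-network training.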
“…The artificial neural network is composed of multiple layers of neurons, and adjacent neurons are connected through artificial synapses, as illustrated in Figure d; the weight is updated by changing the connection strength. The weight value should be modulated continuously and be nonvolatile. Moreover, variations of the Hall resistance, continuously tuned by programming consecutive pulse sequences, are desired to imitate synaptic behavior.…”
(mentioning)
confidence: 99%
“…Conventional computers use the von Neumann architecture, where the main memories are physically and functionally separated from the central processing unit (CPU). The mismatch in processing speed and data-transfer rate between the CPU and memory constrains the operational efficiency of conventional computers. Inspired by the human brain, which stores and processes information concurrently, researchers exploit novel hardware architectures based on emerging nonvolatile memories to implement logic-in-memory architectures. Compared with other memories, magnetoresistive random-access memory (MRAM) has become one of the most popular candidates for in-memory computing because of its low power consumption, infinite endurance, and nonvolatility. The conventional MRAM device based on a magnetic tunnel junction (MTJ) has binary states, which is not efficient for multistate storage. To achieve multiple states, researchers have attempted to integrate multiple MTJ pillars into a single write-line. Controlling the pillars individually yields multiple states, but this requires more MTJ pillars, which greatly increases the bit-cell size. Another approach is to control the DW motion in the free layer of the MTJ, changing the parallel and antiparallel composition between the free layer and the fixed layer to tune the output resistance. Recently, MTJ devices utilizing DW positions for in-memory computing, artificial synapse, and spiking neuron functionalities have been successfully demonstrated. However, this method requires larger device sizes and complex device structures to produce multiple DW locations.…”
(mentioning)
confidence: 99%
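The domain-wall approach mentioned above can be illustrated with a simple parallel-resistance model. This is a hedged sketch, not the cited authors' model: the free layer is treated as two regions, parallel (P) and antiparallel (AP) to the fixed layer, conducting in parallel, with their areal fractions set by the DW position; the resistance values are hypothetical.

```python
# Illustrative sketch (not from the paper): intermediate resistance states
# of an MTJ whose free layer holds a domain wall (DW). The P and AP
# regions conduct in parallel, so their conductances add. Resistance
# values R_P and R_AP are hypothetical.

def mtj_resistance(dw_position, r_p=1000.0, r_ap=2000.0):
    """Resistance of an MTJ with a DW at `dw_position` in [0, 1],
    interpreted as the fraction of free-layer area magnetized parallel
    to the fixed layer:  G = x / R_P + (1 - x) / R_AP.
    """
    if not 0.0 <= dw_position <= 1.0:
        raise ValueError("DW position must lie in [0, 1]")
    g = dw_position / r_p + (1.0 - dw_position) / r_ap
    return 1.0 / g

# Stepping the DW through the free layer yields multiple analog states
states = [round(mtj_resistance(x / 4)) for x in range(5)]
print(states)  # → [2000, 1600, 1333, 1143, 1000]
```

In this picture a single device interpolates continuously between the fully antiparallel and fully parallel resistances, which is why DW position can encode multistate storage without adding extra MTJ pillars.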