2019
DOI: 10.1063/1.5108650

Multilevel HfO2-based RRAM devices for low-power neuromorphic networks

Abstract: Training and recognition with neural networks generally require high throughput, high energy efficiency, and scalable circuits to enable artificial intelligence tasks to be operated at the edge, i.e., in battery-powered portable devices and other limited-energy environments. In this scenario, scalable resistive memories have been proposed as artificial synapses thanks to their scalability, reconfigurability, and high energy efficiency, and thanks to their ability to perform analog computation by physical laws in…
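The abstract's phrase "analog computation by physical laws" refers to the way a resistive crossbar computes a matrix-vector product through Ohm's and Kirchhoff's laws (I_j = Σ_i G_ij · V_i). A minimal numerical sketch of that idea is given below; the conductance and voltage values are arbitrary illustrative numbers, not data from the paper.

```python
import numpy as np

# Conductance matrix G (siemens): each column is one output line of the crossbar.
# Values are arbitrary examples in a plausible RRAM range (tens of microsiemens).
G = np.array([[50e-6, 10e-6],
              [20e-6, 80e-6],
              [ 5e-6, 40e-6]])   # 3 input rows x 2 output columns

# Input voltages applied to the rows (volts).
V = np.array([0.2, 0.1, 0.3])

# Ohm's law per device plus Kirchhoff's current law per column gives the
# output currents, i.e., an analog matrix-vector multiplication.
I = V @ G
print(I)   # column currents in amperes
```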

Cited by 154 publications (141 citation statements) | References 24 publications
“…4a. In such method, pulses of incremental amplitude are applied to the devices (Write) until the required conductance is reached (Verify) [56]. If the target conductance is exceeded, then increasing pulses with the opposite polarity are applied in a similar fashion to gradually reach the target conductance value (within an error margin).…”
Section: LRS / HRS (mentioning)
confidence: 99%
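A minimal sketch of the incremental-pulse write-verify loop described in the excerpt above, assuming hypothetical device-access functions apply_pulse(amplitude) and read_conductance(); the step size, voltage limits, tolerance, and pulse cap are illustrative values, not parameters from the paper.

```python
def program_conductance(target_g, read_conductance, apply_pulse,
                        tol=2e-6, v_start=0.5, v_step=0.1, v_max=3.0):
    """Write-verify loop: pulses of incremental amplitude are applied until the
    target conductance is reached within +/- tol; on overshoot, the pulse
    polarity is reversed and the amplitude ramp restarts (illustrative sketch)."""
    polarity = +1 if read_conductance() < target_g else -1
    v = v_start
    for _ in range(200):                      # safety cap on pulse count
        g = read_conductance()                # Verify
        if abs(g - target_g) <= tol:
            return g                          # target reached within margin
        needed = +1 if g < target_g else -1   # SET (+) or RESET (-) needed
        if needed != polarity:                # overshoot: reverse polarity
            polarity, v = needed, v_start     # and restart the amplitude ramp
        apply_pulse(polarity * v)             # Write
        v = min(v + v_step, v_max)            # incremental amplitude
    return read_conductance()                 # best effort after the cap
```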
“…The tasks can be split into two parts: the first one comprises a set of MATLAB sub-routines for creating, training and writing the SPICE netlist for an ideal feed-forward ANN, while the second part relates to the SPICE simulation of the proposed circuit during the inference phase. It is worth mentioning that although a simpler approach than the more complex RRAM based neural networks explored in the literature (Multi-layer Perceptron [56], [59], [60], Convolutional Neural Networks [61], Spike Neural Networks [62], etc., see Supplementary Table I), the SLP allows studying and clarifying the ANN limitations caused by parasitic effects and non-idealities occurring in the synaptic layers implemented with CPAs, as well as benchmarking the computational costs of the QMM based simulations against other available models. Regarding the MATLAB-implemented part of the procedure, the first step consists in creating the image (n × n) database. Starting with the image size specification, RW, V_read, and connection scheme, the routine creates the database, trains the ANN (single layer perceptron), translates it into a CPA, adds the peripheral control circuit, performs the simulations and processes the results.…”
Section: Procedures for Memdiode CPA Creation, Training and Simulation (mentioning)
confidence: 99%
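As a rough illustration of the flow in this excerpt (train a single-layer perceptron off-line, then map its weights onto a crossbar for inference), the sketch below trains a toy SLP and converts its signed weights into a differential pair of conductance matrices; the conductance window (G_MIN, G_MAX), the differential mapping, and the one-resistor-per-crosspoint netlist stub are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

G_MIN, G_MAX = 1e-6, 100e-6          # assumed programmable conductance window (S)

def train_slp(X, y, lr=0.1, epochs=50):
    """Toy single-layer perceptron trained off-line (the software step of the flow)."""
    W = np.zeros((X.shape[1], y.shape[1]))
    for _ in range(epochs):
        out = (X @ W > 0).astype(float)      # hard-threshold output neurons
        W += lr * X.T @ (y - out)            # perceptron-style weight update
    return W

def weights_to_conductances(W):
    """Map signed weights onto two crossbars (G+ and G-), so the effective
    weight is proportional to G+ - G- (a common differential scheme)."""
    scale = (G_MAX - G_MIN) / max(np.abs(W).max(), 1e-12)
    Gp = G_MIN + scale * np.clip(W, 0, None)
    Gn = G_MIN + scale * np.clip(-W, 0, None)
    return Gp, Gn

def write_netlist(G, path="cpa.cir"):
    """Stub for the netlist-writing step: one resistor per cross-point."""
    with open(path, "w") as f:
        for (i, j), g in np.ndenumerate(G):
            f.write(f"R{i}_{j} row{i} col{j} {1.0/g:.4g}\n")
```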
“…Moreover, an alternative approach aiming at combining high performance with high energy efficiency was proposed in ref. [140]. Here, after off-line training resulting in the optimization of synaptic weights in software, the floating-point precision of the synaptic weights was reduced to only five levels, which were stored in a hardware 4 kbit HfO2 RRAM array using a novel multilevel programming scheme.…”
Section: DNNs with Memristive Synapses (mentioning)
confidence: 99%
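A small sketch of the ex-situ quantization step described in this excerpt: reduce trained floating-point weights to five discrete levels before writing them to the multilevel RRAM array. The uniform level placement over the weight range is an assumption for illustration; the paper's actual level assignment may differ.

```python
import numpy as np

def quantize_to_levels(W, n_levels=5):
    """Snap each weight to the nearest of n_levels values spread uniformly
    over the weight range (illustrative ex-situ quantization)."""
    levels = np.linspace(W.min(), W.max(), n_levels)
    idx = np.abs(W[..., None] - levels).argmin(axis=-1)
    return levels[idx], idx          # quantized weights and their level indices

# Example: the level indices would then be programmed into the multilevel RRAM
# cells, e.g., with a write-verify scheme like the one sketched earlier.
W = np.random.randn(4, 4)
Wq, codes = quantize_to_levels(W)
```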
“…The same set of MNIST data was also recently used for the in-situ supervised training of 1T1R RRAM synaptic crossbars of about 8000 memristors, showing tolerance to defective devices (stuck at low conductance) and reaching high recognition accuracy [14]. Also remarkable is the recent demonstration of the ex-situ training of a two-layer perceptron DNN implemented with a 4 kbit 1T1R RRAM array, which achieved not only high accuracy but also very low power consumption [15].…”
Section: Artificial Intelligence and Its Hardware Implementation (mentioning)
confidence: 99%