2015
DOI: 10.1109/tnano.2015.2448554
Ultrahigh Density Memristor Neural Crossbar for On-Chip Supervised Learning

Abstract: Although there are many candidates for future computing systems, memristor-based neural crossbars are considered especially promising thanks to their low power consumption, high density, and fault tolerance. However, their implementation is still hindered by the limitations of CMOS neuron and learning cells. In this paper, we present a memristor-based neural crossbar (NC) that implements on-chip supervised learning. Instead of using a standard CMOS neuron, a simple CMOS inverter realizes the activation function.…

Cited by 43 publications (16 citation statements); references 33 publications (64 reference statements).
“…To show the former, we emulated a multi-layer perceptron system by feeding forward functions learned in a first layer to build the truth tables of linearly non-separable functions in subsequent layers [40]. Similarly to [44], which showed that a multi-layer memristive system can learn AND and NOT in a first layer with two SNUs, and subsequently the 2-bit XOR function with another SNU in the second, we learned the 3-bit XOR function (01101001). Three SNUs in the first layer and one SNU in the second (32 organic memristive devices in total) are required to resolve this problem.…”
Section: Results
Confidence: 99%
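The excerpt's topology (three units in the first layer, one in the second) suffices for 3-bit XOR because parity of three bits is not linearly separable but is computable by one hidden layer. As a minimal sketch, the following hand-crafted two-layer threshold network reproduces the quoted truth table 01101001; the weights here are chosen for illustration and are not the memristive learning rule used in the cited work.

```python
# Sketch: a two-layer threshold network computing 3-bit XOR (parity).
# The hidden layer has three units, mirroring the "three SNUs in the first
# layer, one SNU in the second" topology quoted above. Weights are
# hand-chosen (not learned), purely to show the function is realizable.

def step(x, theta):
    """Hard-threshold activation: 1 if the weighted sum reaches theta."""
    return 1 if x >= theta else 0

def parity3(bits):
    """3-input XOR via hidden units h_k = [sum(bits) >= k], k = 1, 2, 3."""
    s = sum(bits)
    h = [step(s, k) for k in (1, 2, 3)]    # hidden layer: 3 threshold units
    return step(h[0] - h[1] + h[2], 1)     # output unit, weights (+1, -1, +1)

# Enumerate the truth table for inputs 000..111 (most significant bit first).
table = "".join(str(parity3([(i >> b) & 1 for b in (2, 1, 0)]))
                for i in range(8))
print(table)  # -> 01101001
```

The hidden units count how many inputs are active; the alternating-sign output weights then pick out the odd counts, which is exactly the parity (XOR) function.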
“…Regarding the neuron designs, our NoProp digital design uses the simple, low-power CMOS inverter design proposed and successfully demonstrated in [62]. This system has a total energy footprint of less than 10 fJ per neuron.…”
Section: E. Learning Energy Analysis
Confidence: 99%
“…This scheme is generic to several classes of nanosynapses and highly resistant to device imperfections [13]. The training cell can use stateful memristive devices to correctly route programming pulses to output neurons with the corresponding error case in the readout crossbar layer [14]; the chosen nanosynapse has been directly integrated in such a scheme [15]. Fig.…”
Section: B. Nanoelectric Implementation
Confidence: 99%