2017
DOI: 10.1016/j.neucom.2016.10.061
An approximate backpropagation learning rule for memristor based neural networks using synaptic plasticity

Abstract: We describe an approximation to backpropagation algorithm for training deep neural networks, which is designed to work with synapses implemented with memristors. The key idea is to represent the values of both the input signal and the backpropagated delta value with a series of pulses that trigger multiple positive or negative updates of the synaptic weight, and to use the min operation instead of the product of the two signals. In computational simulations, we show that the proposed approximation to backpropa…
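The update rule sketched in the abstract can be illustrated with a small numerical example. This is a hedged reconstruction, not the paper's actual pulse-based implementation: it assumes a continuous encoding of the signals rather than pulse trains, and the function names `product_update` and `min_update` are hypothetical. The sign of each update is taken as the product of the two signal signs (the pulse polarity), while the magnitude replaces the product with the min.

```python
import numpy as np

def product_update(x, delta, lr=0.01):
    # Standard backprop weight change: outer product of the
    # backpropagated delta and the forward activation x.
    return lr * np.outer(delta, x)

def min_update(x, delta, lr=0.01):
    # Approximation described in the abstract: replace the
    # magnitude |x_j| * |delta_i| with min(|x_j|, |delta_i|);
    # the update polarity is the product of the two signs.
    mag = np.minimum(np.abs(delta)[:, None], np.abs(x)[None, :])
    sign = np.sign(delta)[:, None] * np.sign(x)[None, :]
    return lr * sign * mag
```

Under this simplification, both rules agree on the direction of every weight update and differ only in magnitude, which is why the approximation can still drive learning toward the same minima in simulation.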

Cited by 39 publications (17 citation statements)
References 23 publications
“…Furthermore, it is efficient to improve the characteristics of memristor synapses depending on individual neuromorphic networks, because a desirable memristor synapse capable of being employed into neuromorphic systems is yet to be reported. Supervised learning-based networks [35,[40][41][42][43][44], for example, are less vulnerable to cycle-to-cycle and device-to-device variations. This is because memristor synapses are updated according to calculated errors under known target values.…”
Section: (A) Cation-based Devices: Through Electrochemical Reaction mentioning
confidence: 99%
“…This influences output currents, leading to degradation of learning performance. The non-idealities in array-level could be overcome by device functions [35,44], operational scheme [39,[58][59][60], or learning algorithms [35,[40][41][42][43][44] to some degree.…”
Section: Neuromorphic Systems Based On Crossbar Array Of Memristor Symentioning
confidence: 99%
“…There are several works proposing the implementation of memristive neural networks with the backpropagation algorithm in the digital and mixed-signal domains [4], [5], [6], [7], [8], [9], [10]. However, analog learning circuits based on the conventional backpropagation learning algorithm [11], [12], [8], [13], [14] in memristive crossbars have not been fully implemented. The implementation of such a learning algorithm opens up an opportunity to create an analog hardware-based learning architecture.…”
Section: Introductionmentioning
confidence: 99%
“…The possibility of using memristors as synapses in neural networks has been extensively studied. The wealth of proposals in this field can be broadly split into two groups: one related to spike-timing-dependent plasticity (STDP) and spiking neural networks (SNN) [26][27][28][29][30], and the other to more traditional neural network models [31][32][33][34][35][36][37][38][39][40][41][42][43]. The first group has a more biological focus, with its main goal being the reproduction of effects occurring in natural neural networks, rather than algorithmic improvements.…”
Section: Introductionmentioning
confidence: 99%