2015
DOI: 10.1145/2629503
On-Chip Universal Supervised Learning Methods for Neuro-Inspired Block of Memristive Nanodevices

Abstract: Scaling down beyond CMOS transistors requires the combination of new computing paradigms and novel devices. In this context, neuromorphic architectures are developed to achieve robust and ultra-low-power computing systems. Memristive nanodevices are often associated with these architectures to implement synapses efficiently at ultra-high density. In this article, we investigate the design of a neuro-inspired logic block (NLB) dedicated to on-chip function learning and propose a learning strategy. It is composed of …

Cited by 11 publications (10 citation statements)
References 56 publications
“…Memristive crossbar circuits have also been demonstrated to be suitable for efficient neural network training by Irina et al. [50], who show low error rates using batch and stochastic training approaches on a handwritten digit recognition dataset. Neuro-inspired devices have been developed for unsupervised learning by Chabi et al. [51,52], as well as for an inference engine by Querlioz et al. [53]. A general model for voltage-controlled memristors has been developed by Kvatinsky et al. [54].…”
Section: Non-CMOS and Hybrid Solutions
confidence: 99%
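The batch and stochastic training approaches mentioned in the statement above can be contrasted with a minimal sketch. This is not the cited implementation; the toy linear model, learning rate, and data are assumptions for illustration only:

```python
import numpy as np

def stochastic_step(w, x, y, lr=0.1):
    """Update weights after a single example (stochastic / online training)."""
    err = y - x @ w
    return w + lr * err * x

def batch_step(w, X, Y, lr=0.1):
    """Update weights once using the gradient averaged over the whole batch."""
    err = Y - X @ w
    return w + lr * (X.T @ err) / len(X)

# Toy data: recover w* = [2.0, -1.0] from noiseless linear observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
Y = X @ np.array([2.0, -1.0])

w = np.zeros(2)
for _ in range(200):
    w = batch_step(w, X, Y)
print(np.round(w, 2))
```

Batch updates average the gradient over all examples per step, while the stochastic variant updates after every example; both reach the same solution here, but trade off update noise against per-step cost.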
“…Parasitic resistances are materialized by the R_in+ and R_in− variables (adapted from [22]). Each input X_i, then, will have two nanowires (X_i+, X_i−); at their intersection with row j, two memristors (M_ij+, M_ij−) encode a unique synaptic weight pair as the difference (G_ij+ − G_ij−), for all input/neuron combinations in the entire array, as follows:…”
Section: A. Learning Principles
confidence: 99%
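The differential weight encoding described in this quote, W_ij = G_ij+ − G_ij−, can be sketched as follows. The conductance range and the linear mapping are illustrative assumptions, not values from the cited work:

```python
# Each synaptic weight is encoded as the difference of two memristor
# conductances on the paired nanowires (X_i+, X_i-):  W_ij = G_ij+ - G_ij-.
G_MIN, G_MAX = 1e-6, 1e-3   # assumed conductance range in siemens (illustrative)

def encode(weight, w_max=1.0):
    """Map a signed weight in [-w_max, w_max] to a (G+, G-) conductance pair."""
    span = G_MAX - G_MIN
    g_plus = G_MIN + span * max(weight, 0.0) / w_max
    g_minus = G_MIN + span * max(-weight, 0.0) / w_max
    return g_plus, g_minus

def decode(g_plus, g_minus, w_max=1.0):
    """Recover the signed weight from the conductance difference."""
    return (g_plus - g_minus) / (G_MAX - G_MIN) * w_max

gp, gm = encode(-0.5)
print(round(decode(gp, gm), 3))  # -0.5
```

Encoding the sign in a conductance pair avoids negative resistances: one device of the pair carries the positive part of the weight and the other the negative part.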
“…Now, the proper voltage level for the programming pulses (V_p+, V_p−) must sit at either second threshold (V_th2 or −V_th2), since slight changes in either direction can induce an increment or decrement. The polarity of the pulse must follow the sign of the expected output for the row or function, Y_j, with the condition that V_p+ = −V_th2 and V_p− = +V_th2 (since conductance drops along the second threshold) [22]. Given this, a single pulse fed simultaneously to signal lines S+ and S− can then implement learning with only one programming pulse per cycle.…”
Section: B. Programming Pulse Schemes
confidence: 99%
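A minimal sketch of the pulse-polarity selection described in this quote, assuming an illustrative V_th2 value and reading "polarity follows the sign of Y_j" as flipping both pulses for a negative expected output:

```python
V_TH2 = 1.2  # assumed second-threshold voltage in volts (illustrative value)

def programming_pulses(y_expected):
    """Choose the (V_p+, V_p-) pulse pair from the sign of the expected output.

    Per the quoted scheme, V_p+ = -V_th2 and V_p- = +V_th2; applying the
    pair on the S+ / S- signal lines adjusts both memristors of the
    differential pair in a single programming cycle.
    """
    sign = 1 if y_expected >= 0 else -1
    return (sign * -V_TH2, sign * V_TH2)  # (V_p+, V_p-)

print(programming_pulses(+1))  # (-1.2, 1.2)
print(programming_pulses(-1))  # (1.2, -1.2)
```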
“…To prevent weight saturation, the synaptic weight is weakened if the postsynaptic neuron fires first, as such causality cannot be implied (the anti-Hebb rule). Although we focus on unsupervised Hebbian learning, we note that supervised learning approaches also exist for memristive neural networks [4].…”
Section: Memristive Spiking Network
confidence: 99%
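The Hebb / anti-Hebb rule in this quote can be sketched as a pair-based update; the learning rate and weight bounds are illustrative assumptions:

```python
def hebb_update(w, dt, lr=0.05, w_min=0.0, w_max=1.0):
    """Pair-based Hebb / anti-Hebb weight update.

    dt = t_post - t_pre. If the presynaptic spike precedes the postsynaptic
    one (dt > 0), causality is assumed and the weight is potentiated; the
    reverse order (dt < 0) depresses it, which also bounds weight growth.
    """
    if dt > 0:       # pre before post: strengthen (Hebb)
        w += lr
    elif dt < 0:     # post before pre: weaken (anti-Hebb)
        w -= lr
    return min(max(w, w_min), w_max)

print(hebb_update(0.5, dt=+2))
print(hebb_update(0.5, dt=-2))
```

Clamping to [w_min, w_max] mirrors the bounded conductance of a physical memristor, and the anti-Hebb branch is what keeps weights from saturating at the upper bound.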
“…For each robot step (64 ms in simulation time), the robot samples the six sensors: the six-dimensional input vector is scaled so that the entire sensor range falls within [0,1], and is used as I in equation (4). The network is then run for 21 processing steps (experimentally determined to give the bipolar synapses enough time to change their plasticity and affect useful behaviour generation), and the spike trains at the output neurons are discretised to either high or low activation to generate an action (high if more than half of the 21 processing steps produced a spike at the neuron, low otherwise).…”
Section: A. Controller Integration
confidence: 99%
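The discretisation step described in this quote (high activation when more than half of the 21 processing steps produced a spike) can be sketched as:

```python
def action_from_spikes(spike_train, n_steps=21):
    """Discretise an output neuron's spike train to a binary activation.

    Returns 'high' when more than half of the n_steps processing steps
    produced a spike (i.e. > 10 of 21 under the quoted setting), and
    'low' otherwise.
    """
    return "high" if sum(spike_train) > n_steps // 2 else "low"

print(action_from_spikes([1] * 12 + [0] * 9))   # high
print(action_from_spikes([1] * 10 + [0] * 11))  # low
```

A strict majority over the whole window acts as a simple low-pass filter on the spike train, so a single spurious spike cannot flip the generated action.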