2017 75th Annual Device Research Conference (DRC)
DOI: 10.1109/drc.2017.7999481

Supervised learning in spiking neural networks with MLC PCM synapses

Abstract: Spiking neural networks (SNNs) are artificial computational models inspired by the brain's ability to naturally encode and process information in the time domain. The added temporal dimension is believed to render them more computationally efficient than conventional artificial neural networks, though their full computational capabilities are yet to be explored. Recently, computational memory architectures based on non-volatile memory crossbar arrays have shown great promise to implement para…

Cited by 13 publications (11 citation statements)
References 43 publications
“…which respectively correspond to the learning signal and eligibility trace, i.e., the running average of the gradients of the log-loss. The global update at the BS is then given by (4). As summarized in Algorithm 1, FL-SNN is based on local and global feedback signals, rather than backpropagation.…”
Section: FL-SNN: FL with Distributed SNNs (mentioning, confidence: 99%)
“…As in [5], the local learning signal ℓ(i)(t), computed every ∆s, indicates to the hidden neurons within the SNN of each device i how effective their current signaling is in maximizing the probability of the desired input-output behavior defined by the selected data (x(i), y(i)). In contrast, the global feedback signal θ(t) is given by the global averaged parameter (4), which aims at enabling cooperative training via FL.…”
Section: FL-SNN: FL with Distributed SNNs (mentioning, confidence: 99%)
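The FL-SNN update pattern described in the quotes above — each device scales an eligibility trace (a running average of log-loss gradients) by a local learning signal, while a base station (BS) periodically averages the local parameters — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the decay constant `kappa`, the learning rate, and the constant stand-in learning signal are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_devices, n_params = 3, 4
kappa = 0.9   # eligibility-trace decay (assumed value)
lr = 0.1      # learning rate (assumed value)

theta = rng.standard_normal(n_params)            # global parameter at the BS
local = [theta.copy() for _ in range(n_devices)]
trace = [np.zeros(n_params) for _ in range(n_devices)]

for step in range(10):
    for i in range(n_devices):
        grad = rng.standard_normal(n_params)     # stand-in for a log-loss gradient
        # Eligibility trace: running average of the gradients.
        trace[i] = kappa * trace[i] + (1 - kappa) * grad
        learning_signal = 1.0                    # stand-in for the local signal ℓ(i)(t)
        local[i] -= lr * learning_signal * trace[i]
    # Global update at the BS: average the local parameters and broadcast.
    theta = np.mean(local, axis=0)
    local = [theta.copy() for _ in range(n_devices)]
```

Note that training relies only on these local and global feedback signals; no backpropagation through time is performed, matching the scheme the quotes describe.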
“…[45] Though a complete training algorithm of a multilayer spatiotemporal network is still missing, training the network to map a spatiotemporal input into a spatiotemporal output (Fig. 8a) is a critical step.…”
Section: Learning of a Spiking Sequence (mentioning, confidence: 99%)
“…To combine the fast training in crosspoint architectures with the flexibility of GPUs, recently proposed analog/digital hybrid systems implement the crossbar array as an analog computational unit able to largely accelerate the calculation of the MVM [65][66][67][68], thus relaxing the computational load on the digital section [68]. For instance, a dot-product engine was proposed where the resistive crossbar array efficiently calculates the MVM via the vector scalar product a·b = Σᵢ aᵢbᵢ, achieving in simulation software-like accuracy on MNIST digit recognition by implementing pre-trained weights and efficiently performing forward inference [65,66].…”
(mentioning, confidence: 99%)
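The dot-product engine in the quote above maps a matrix-vector multiplication (MVM) onto a crossbar: each output column sums the products of the input voltages and the stored conductances, i.e. a·b = Σᵢ aᵢbᵢ per column. A minimal numerical sketch of that computation (illustrative values only; `G` and `v` are not from the cited work):

```python
import numpy as np

def crossbar_mvm(G, v):
    """Column currents I_j = sum_i v_i * G_ij (Ohm's and Kirchhoff's laws)."""
    return v @ G

# Conductance matrix: rows correspond to input rows, columns to outputs.
G = np.array([[1.0, 0.5],
              [0.2, 0.8],
              [0.3, 0.1]])
v = np.array([1.0, 2.0, 3.0])  # input voltage vector

print(crossbar_mvm(G, v))  # → [2.3 2.4]
```

The analog array delivers every column's dot product in parallel in a single read step, which is the source of the acceleration claimed for these hybrid systems.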