2021
DOI: 10.48550/arxiv.2108.01804
Preprint

Online Training of Spiking Recurrent Neural Networks with Phase-Change Memory Synapses

Yigit Demirag,
Charlotte Frenkel,
Melika Payvand
et al.

Abstract: Spiking Recurrent Neural Networks (RNNs) are a promising tool for solving a wide variety of complex cognitive and motor tasks, due to their rich temporal dynamics and sparse processing. However, training spiking RNNs on dedicated neuromorphic hardware is still an open challenge. This is mainly due to the lack of local, hardware-friendly learning mechanisms that can solve the temporal credit assignment problem and ensure stable network dynamics, even when the weight resolution is limited. These challenges are fu…
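To make the limited-weight-resolution setting concrete, the sketch below runs one discrete-time step of a leaky integrate-and-fire recurrent layer whose recurrent weights are quantized to a few bits. It is an illustrative toy, not the paper's implementation; all function names and parameter values are assumptions.

```python
# Minimal sketch (illustrative only): one step of a leaky integrate-and-fire (LIF)
# recurrent layer with low-resolution recurrent weights, as in the setting the
# abstract describes. Names and parameter values are assumptions.
import numpy as np

def quantize(w, n_bits=4, w_max=1.0):
    """Round weights to a uniform grid with 2**n_bits levels in [-w_max, w_max]."""
    step = 2 * w_max / (2 ** n_bits - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def lif_step(v, spikes, x, w_in, w_rec, alpha=0.9, v_th=1.0):
    """One Euler step of LIF membrane dynamics driven by input and recurrent spikes."""
    v = alpha * v + x @ w_in + spikes @ quantize(w_rec)  # leak + input + recurrence
    spikes = (v >= v_th).astype(v.dtype)                  # threshold crossing
    v = v - spikes * v_th                                 # soft reset
    return v, spikes

# Usage: 100 input channels driving 200 recurrent LIF neurons for one step.
rng = np.random.default_rng(0)
w_in, w_rec = rng.normal(0, 0.1, (100, 200)), rng.normal(0, 0.1, (200, 200))
v, s = np.zeros(200), np.zeros(200)
v, s = lif_step(v, s, rng.poisson(0.05, 100), w_in, w_rec)
```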

Cited by 2 publications (2 citation statements, both classified as mentioning)
References 48 publications
“…In the presence of a teaching signal, a supervised learning framework can be used. For online and on-chip learning systems using events, this can be done through approximations of Backpropagation Through Time [42][43][44], which can be implemented both on digital [45] and in-memory memristive [46] neuromorphic hardware. However, in the absence of supervision, MEMSORN-like hardware changes its structure and self-organizes to cluster the input signal.…”
Section: Comparison To Other Neuromorphic Self-Organizing Networks (mentioning)
confidence: 99%
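As a rough illustration of what such approximations of Backpropagation Through Time compute online, the toy sketch below keeps a local eligibility trace per synapse and combines it with a per-neuron learning signal. It follows the general eligibility-trace idea (as in e-prop-style rules) rather than any of the cited implementations; the shapes, the surrogate derivative, and all parameter values are assumptions.

```python
# Toy sketch (assumptions only): an eligibility-trace style, online approximation of
# Backpropagation Through Time. Each synapse keeps a local trace; the weight update
# is the product of that trace with a per-neuron learning signal.
import numpy as np

def surrogate_grad(v, v_th=1.0, beta=10.0):
    """Smooth pseudo-derivative of the spike nonlinearity around threshold."""
    return 1.0 / (1.0 + beta * np.abs(v - v_th)) ** 2

def online_update(w_rec, trace, pre_spikes, v, learning_signal, alpha=0.9, lr=1e-3):
    """Update per-synapse eligibility traces and apply a purely local weight change."""
    trace = alpha * trace + np.outer(pre_spikes, surrogate_grad(v))  # local trace
    w_rec = w_rec - lr * learning_signal[None, :] * trace            # local update
    return w_rec, trace

# Usage: 200 recurrent neurons, one time step with a random learning signal.
rng = np.random.default_rng(0)
n = 200
w, trace = rng.normal(0, 0.1, (n, n)), np.zeros((n, n))
w, trace = online_update(w, trace, rng.integers(0, 2, n), rng.normal(0, 1, n),
                         rng.normal(0, 1, n))
```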
“…A simulation framework of differential-architecture crossbar arrays is developed in [27] to simulate spiking recurrent neural networks with PCM.…”
Section: A. Related Work (mentioning)
confidence: 99%
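For intuition on the differential-architecture idea mentioned above, the sketch below encodes each signed weight as the difference of two non-negative conductances (G+ and G−) and adds Gaussian read noise to a crossbar matrix-vector product. It is not the cited simulation framework; the noise model, scaling, and all names are placeholder assumptions.

```python
# Sketch (assumptions only): a signed synaptic weight stored as a differential pair
# of PCM conductances, W = G+ - G-, read out through a crossbar as a noisy
# matrix-vector product. Noise level and scaling are placeholders.
import numpy as np

def differential_read(g_plus, g_minus, x, read_noise=0.02, rng=None):
    """Effective weight is G+ - G-; each read adds Gaussian conductance noise."""
    if rng is None:
        rng = np.random.default_rng()
    noisy_w = (g_plus - g_minus) + rng.normal(0, read_noise, g_plus.shape)
    return x @ noisy_w

# Usage: map target weights onto the positive/negative branches of each pair.
rng = np.random.default_rng(1)
w_target = rng.normal(0, 0.1, (100, 200))
g_plus, g_minus = np.maximum(w_target, 0), np.maximum(-w_target, 0)
y = differential_read(g_plus, g_minus, rng.poisson(0.05, 100), rng=rng)
```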