2016 IEEE International Workshop on Signal Processing Systems (SiPS)
DOI: 10.1109/sips.2016.61

Stochastic Computing Can Improve Upon Digital Spiking Neural Networks

Cited by 23 publications (9 citation statements). References 20 publications.

“…The proposed algorithms can be extended to other spiking and non-spiking hardware substrates that have the ability to perform stochastic computing and provide the capability to have recurrent neural network connections. Prior work such as Smithson et al. (2016), Thakur et al. (2016), and Cassidy et al. (2013) shows that digital spiking neural substrates can perform stochastic computing. We can also perform stochastic computing on non-spiking hardware substrates such as FPGAs (Li et al., 2016), FinFETs (Zhang et al., 2017), and magnetic tunnel junctions (Lv and Wang, 2017).…”
Section: Discussion
confidence: 99%
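
The statement above assumes familiarity with what it means for a substrate to "perform stochastic computing". As background, here is a minimal Python sketch (my illustration, not code from the cited papers) of the core unipolar SC primitive: a value in [0, 1] is encoded as the mean of a random bitstream, and a single AND gate multiplies two independent streams.

    import random

    def to_bitstream(p, n, rng):
        # Encode a probability p in [0, 1] as a length-n Bernoulli bitstream
        # whose empirical mean approximates p.
        return [1 if rng.random() < p else 0 for _ in range(n)]

    def sc_multiply(xs, ys):
        # Unipolar SC multiplication: AND two independent bitstreams, since
        # P(x AND y = 1) = P(x = 1) * P(y = 1).
        return [x & y for x, y in zip(xs, ys)]

    rng = random.Random(42)
    n = 10_000
    xs = to_bitstream(0.6, n, rng)
    ys = to_bitstream(0.5, n, rng)
    est = sum(sc_multiply(xs, ys)) / n
    print(f"estimated product: {est:.3f}  (exact: 0.300)")

Longer streams trade latency for accuracy, which is the basic cost model of SC on both spiking and non-spiking substrates.
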
“…Stochastic computation techniques apply randomness to the process of computation itself (Shanbhag et al., 2010). Variants of this approach have been applied to spiking neural networks (Rosselló et al., 2012; Ahmed et al., 2016; Smithson et al., 2016). These are mostly orthogonal to the ideas we discuss, since a different (stochastic) hardware architecture for the elementary compute units can also be incorporated into our approach, which introduces randomness in the process of spike propagation.…”
Section: Stochastic Techniques
confidence: 99%
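
To make "randomness in the process of spike propagation" concrete, the toy Python sketch below (an illustration under my own assumptions, not the authors' implementation) forwards each spike across a synapse with a fixed transmission probability, so the delivered spike rate is the input rate scaled by that probability.

    import random

    def propagate(spikes, p_transmit, rng):
        # Forward each incoming spike independently with probability
        # p_transmit; the expected output rate is p_transmit * input rate.
        return [1 if s and rng.random() < p_transmit else 0 for s in spikes]

    rng = random.Random(0)
    T = 20_000
    inp = [1 if rng.random() < 0.4 else 0 for _ in range(T)]  # rate ~ 0.4
    out = propagate(inp, p_transmit=0.25, rng=rng)
    print(sum(out) / T)  # ~ 0.4 * 0.25 = 0.10
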
“…Despite the great similarities between SNNs and SC-based neural networks in inference computations [99], the training procedure of SNNs is significantly different from that of SC-based neural networks. SNNs are trained using an unsupervised, biologically inspired learning method called spike-timing-dependent plasticity (STDP), in which weights are updated depending on the relative timing of the incoming and outgoing spikes of a neuron [100].…”
Section: A. Neuromorphic Computing
confidence: 99%
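
For readers unfamiliar with the STDP rule mentioned above, the sketch below shows the standard pair-based exponential STDP window; the parameter values are illustrative placeholders, not taken from the cited work. The sign and magnitude of the weight update depend only on the relative timing of the pre- and postsynaptic spikes.

    import math

    def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0):
        # Pair-based exponential STDP window (times in ms): potentiate when
        # the presynaptic spike precedes the postsynaptic one (dt > 0),
        # depress otherwise.
        dt = t_post - t_pre
        if dt > 0:
            return a_plus * math.exp(-dt / tau_plus)    # causal pair -> LTP
        return -a_minus * math.exp(dt / tau_minus)      # anti-causal -> LTD

    print(stdp_dw(t_pre=10.0, t_post=15.0))  # positive: weight increases
    print(stdp_dw(t_pre=15.0, t_post=10.0))  # negative: weight decreases

Because the update uses only local spike times, STDP needs no global error signal, which is a key contrast with the gradient-based training typically used for SC-based neural networks.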