2015
DOI: 10.1109/tnnls.2015.2404938

DL-ReSuMe: A Delay Learning-Based Remote Supervised Method for Spiking Neurons

Abstract: Recent research has shown the potential capability of spiking neural networks (SNNs) to model complex information processing in the brain. There is biological evidence that the precise timing of spikes is used for information coding. However, the exact learning mechanism by which a neuron is trained to fire at precise times remains an open problem. The majority of the existing learning methods for SNNs are based on weight adjustment. However, there is also biological evidence that the synaptic delay i…


Cited by 83 publications (57 citation statements)
References 40 publications (59 reference statements)
“…Each input neuron is randomly connected to a fraction of the hidden neurons, as in [18]. The LIF neuron model described in [41] is used. The proposed method trains the spiking network by adjusting the learning parameters of the hidden and output neurons in parallel.…”
Section: Methods
Confidence: 99%
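The leaky integrate-and-fire (LIF) neuron referenced above can be sketched as follows. This is an illustrative minimal implementation using forward-Euler integration; the parameter values (`tau_m`, `v_thresh`, etc.) are assumptions for the sketch, not the exact configuration used in [41].

```python
import numpy as np

def lif_simulate(input_current, dt=0.1, tau_m=10.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0, r_m=1.0):
    """Simulate a leaky integrate-and-fire neuron with Euler steps.

    Membrane dynamics: dV/dt = (-(V - v_rest) + r_m * I) / tau_m.
    A spike is recorded and V is reset whenever V crosses v_thresh.
    Returns the list of spike times (in the same units as dt).
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        v += dt * (-(v - v_rest) + r_m * i_in) / tau_m
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A constant supra-threshold current drives regular, repeated firing.
spikes = lif_simulate(np.full(1000, 1.5))
```

With a constant input above threshold, the neuron charges toward `r_m * I`, fires when it crosses `v_thresh`, resets, and repeats, producing a regular spike train.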
See 2 more Smart Citations
“…Each input neuron is randomly connected to a fraction number of hidden neurons as used in [18]. The LIF neuron model described in [41] is used. The proposed method trains the spiking network by adjusting the learning parameters of the hidden and output neurons in parallel.…”
Section: Methodsmentioning
confidence: 99%
“…Suppose a pattern from class i is applied to the network and an actual output of the network is generated. The correlation between the actual output and the corresponding desired spike train of class i, denoted c i , is calculated by the method used in [41] as in…”
Section: Classification Ability of the Proposed Method
Confidence: 99%
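A common way to compute such a spike-train correlation is to convolve each train with a Gaussian kernel and take the cosine similarity of the resulting continuous signals. The sketch below follows that general recipe; the kernel choice and `sigma` value are assumptions here, and the exact formulation in [41] may differ.

```python
import numpy as np

def spike_train_correlation(train_a, train_b, duration=100.0,
                            dt=0.1, sigma=2.0):
    """Normalized correlation between two spike trains.

    Each train (a list of spike times) is turned into a continuous
    signal by placing a Gaussian of width sigma at every spike, then
    the cosine similarity of the two signals is returned (1.0 for
    identical trains, near 0.0 for non-overlapping ones).
    """
    t = np.arange(0.0, duration, dt)

    def filtered(train):
        sig = np.zeros_like(t)
        for s in train:
            sig += np.exp(-0.5 * ((t - s) / sigma) ** 2)
        return sig

    fa, fb = filtered(train_a), filtered(train_b)
    denom = np.linalg.norm(fa) * np.linalg.norm(fb)
    return float(fa @ fb / denom) if denom > 0 else 0.0
```

In a classification setting, the class whose desired spike train yields the highest correlation c i with the actual output is taken as the predicted class.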
“…We note that a set of basis functions, along with the weight matrices, describes the spatio-temporal receptive field of the neurons [20]. Examples of basis functions include raised cosines with different synaptic delays [20]- [22], as illustrated in Fig. 2.…”
Section: Dynamic Exponential Family Model
Confidence: 99%
“…The most documented evidence for supervised learning in the central nervous system (CNS) comes from the studies on the cerebellum and the cerebellar cortex [21], [22]. However, the exact mechanisms underlying supervised learning in the biological neurons remain an open problem [23], [24]. To date, many supervised learning methods have been proposed in order to train the spiking neurons to generate desired sequences of spikes.…”
Section: Introduction
Confidence: 99%