2012 6th IEEE International Conference on Intelligent Systems
DOI: 10.1109/is.2012.6335110

Evolving spiking neural networks for spatio- and spectro-temporal pattern recognition

Abstract: This paper provides a survey on the evolution of the evolving connectionist systems (ECOS) paradigm, from simple ECOS introduced in 1998 to evolving spiking neural networks (eSNN) and neurogenetic systems. It presents methods for their use for spatio- and spectro-temporal pattern recognition. Future directions are highlighted.

Cited by 11 publications (9 citation statements)
References 49 publications (58 reference statements)
“…Spiking neural networks (SNN) are advanced machine learning techniques. They simulate the behavior of biological neural networks by creating and updating connections between spiking neurons (synaptic connections) to learn temporal associations between them [1,2,13–21]. SNN are considered the third generation of artificial neural networks and have advantages over traditional neural networks due to their processing and incremental learning of temporal information.…”
Section: Spiking Neural Network (SNN)
confidence: 99%
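The excerpt above describes spiking neurons as the basic processing unit of SNNs. As an illustration only (not taken from the surveyed paper), the leaky integrate-and-fire (LIF) model, the most common spiking neuron model, can be sketched as follows; all parameter values are illustrative assumptions:

```python
import numpy as np

def lif_simulate(input_current, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron; returns spike times.

    Illustrative sketch: the membrane potential leaks toward rest with
    time constant tau, integrates the input, and emits a spike (then
    resets) whenever it crosses the threshold.
    """
    v = 0.0
    spikes = []
    for step, i_t in enumerate(input_current):
        # Leaky integration: decay toward rest plus input drive
        v += dt * (-v / tau + i_t)
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset  # reset after firing
    return spikes

# A step current switched on at t = 20 produces a regular spike train
current = np.concatenate([np.zeros(20), 0.2 * np.ones(80)])
print(lif_simulate(current))
```

The timing of the output spikes, not just their count, carries information, which is the property the third-generation framing emphasizes.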
“…Although it is widely accepted that the graded activation of these analog neurons is equivalent to the firing rate of spiking neurons inside the biological neural network (Rueckauer et al 2017), the view that precise spike timing encodes information in the temporal domain that is important for neural computation is just as prominent in the neuroscience community (Gerstner and Kistler 2002;Gütig and Sompolinsky 2006). Consequently, the spiking neuron models are proposed to describe the dynamic of spike generation process, such that the additional temporal information including precise spike timing and phase could be better captured (Kasabov et al 2013).…”
Section: Introduction
confidence: 99%
“…One well-known single-spike learning algorithm is the Tempotron (Gütig and Sompolinsky 2006), whereby the output neuron is trained to fire a single spike in response to the correct input patterns and remain silent otherwise. The Rank-Order learning algorithms (Kasabov et al 2013) update synaptic weights based on the rank order of the arrival time of incoming spikes. Additionally, the time-to-first spike decoding scheme is employed at the output layer.…”
Section: Introduction
confidence: 99%
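The rank-order idea mentioned above can be sketched in a few lines: earlier-arriving spikes receive larger weights, w_i = mod^order(i) with a modulation factor 0 < mod < 1. The helper name and the value of `mod` are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def rank_order_weights(spike_times, mod=0.9):
    """Assign synaptic weights by the rank order of spike arrival.

    The input whose spike arrives first gets rank 0 and hence the
    largest weight mod**0 = 1; later inputs are discounted
    geometrically (illustrative sketch of rank-order coding).
    """
    # Double argsort yields each input's rank in arrival order
    order = np.argsort(np.argsort(spike_times))
    return mod ** order

# Input 1 spikes first (t = 1.0), so it receives the largest weight
times = np.array([5.0, 1.0, 3.0])
print(rank_order_weights(times))
```

Because only the order of first spikes matters, this scheme pairs naturally with the time-to-first-spike decoding mentioned in the excerpt.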
“…In contrast, spiking NNs can harness their inherent sensitivity to time to discover such temporal correlations by taking into account the relative timings between spikes. However, while various mechanisms, applications and implementations for spike-based processing have been studied in depth [5,7,12,13,15,21,22,24], surprisingly little attention has been paid to the problem of converting real-valued input into spike trains. Note that this is separate to the issue of encoding information using spike trains, which has been investigated in detail (e.g., see [29] for a survey of encoding techniques).…”
Section: Introduction
confidence: 99%
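One common approach to the conversion problem raised above, population encoding with overlapping Gaussian receptive fields (widely used in the eSNN literature), can be sketched as follows; the function and parameter names are illustrative assumptions:

```python
import numpy as np

def gaussian_population_encode(x, n_neurons=6, x_min=0.0, x_max=1.0, t_max=10.0):
    """Encode a real value as first-spike latencies across a population.

    Each neuron has a Gaussian receptive field centred at a different
    point of the input range; the neuron whose centre is closest to x
    is most strongly activated and therefore fires earliest
    (illustrative sketch of population/latency coding).
    """
    centres = np.linspace(x_min, x_max, n_neurons)
    sigma = (x_max - x_min) / (n_neurons - 1)
    activation = np.exp(-0.5 * ((x - centres) / sigma) ** 2)  # in (0, 1]
    return t_max * (1.0 - activation)  # strong activation -> early spike

# x = 0.3 lies between the centres at 0.2 and 0.4, so those two
# neurons fire earliest
print(gaussian_population_encode(0.3))
```

This turns a single real value into a spatio-temporal spike pattern, which is exactly the input format the spike-based learning rules discussed above operate on.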