2018
DOI: 10.3389/fninf.2018.00079
Recurrent Spiking Neural Network Learning Based on a Competitive Maximization of Neuronal Activity

Abstract: Spiking neural networks (SNNs) are believed to be highly computationally and energy efficient, making them well suited for real-time solutions on dedicated neurochip hardware. However, there is a lack of learning algorithms for complex SNNs with recurrent connections that are comparable in efficiency with back-propagation techniques and capable of unsupervised training. Here we suppose that each neuron in a biological neural network tends to maximize its activity in competition with other neurons, and put this principle at the basis of a new SNN …
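The principle named in the abstract, each neuron maximizing its own activity while competing with its neighbors, can be made concrete with a small sketch. The NumPy toy below assumes that competition is realized as winner-take-all over leaky integrate-and-fire membrane potentials and that the winning neuron pulls its afferent weights toward the current input Hebbian-style; the layer sizes, constants, and the `step` helper are illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 100, 10      # layer sizes (illustrative)
tau, v_th = 20.0, 1.0      # membrane time constant (ms) and firing threshold (assumed)
lr = 0.01                  # Hebbian learning rate (assumed)

W = rng.uniform(0.0, 0.1, size=(n_out, n_in))  # afferent weights
v = np.zeros(n_out)                            # membrane potentials

def step(spikes_in, dt=1.0):
    """One step: leaky integration, winner-take-all competition, and a
    Hebbian update that lets the winner 'maximize its activity' by moving
    its weights toward the input pattern that made it fire."""
    global v
    v += dt / tau * (-v) + W @ spikes_in   # leaky integrate-and-fire dynamics
    winner = int(np.argmax(v))
    if v[winner] >= v_th:                  # only the most active neuron fires
        v[:] = 0.0                         # lateral inhibition silences rivals
        W[winner] += lr * (spikes_in - W[winner])  # bounded Hebbian move
        return winner
    return None

# usage: drive the layer with sparse random binary input spike vectors
for _ in range(1000):
    step((rng.random(n_in) < 0.05).astype(float))
```

Over many presentations, each output neuron comes to win for a cluster of similar inputs, which is the kind of unsupervised, competitive behavior the abstract describes.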

Cited by 42 publications (28 citation statements). References: 59 publications.
“…Also, the NSs with trainable competition between neurons could be implemented in hardware with excitatory and inhibitory neurons. [47]…”
Section: Dopamine-Like Modulation of STDP with Different Spike Shapes (mentioning)
Confidence: 99%
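The section title of this citing work refers to dopamine-like modulation of STDP. A common way to realize that idea, useful for reading the statement, is to let pair-based STDP write into an eligibility trace that is converted into a weight change only when a dopamine-like reward signal is present. The sketch below shows that generic scheme; the time constants, amplitudes, and function names are assumptions for illustration, not the cited paper's implementation.

```python
import numpy as np

tau_e, tau_stdp = 200.0, 20.0   # eligibility-trace and STDP time constants, ms (assumed)
a_plus, a_minus = 0.01, 0.012   # potentiation/depression amplitudes (assumed)

def stdp_kernel(dt_spike):
    """Pair-based STDP: potentiate when pre precedes post (dt_spike > 0).
    One time constant is used for both branches for brevity."""
    if dt_spike > 0:
        return a_plus * np.exp(-dt_spike / tau_stdp)
    return -a_minus * np.exp(dt_spike / tau_stdp)

def update(w, elig, dt_spike, dopamine, dt=1.0):
    """Dopamine-gated plasticity: STDP accumulates in the eligibility
    trace; the trace becomes a weight change only when dopamine != 0."""
    elig += -dt / tau_e * elig + stdp_kernel(dt_spike)
    w += dt * dopamine * elig
    return w, elig

# usage: a pre-before-post pairing rewarded by a dopamine pulse
w, e = 0.5, 0.0
w, e = update(w, e, dt_spike=5.0, dopamine=1.0)
```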
“…The non-linear behavior of memristive devices in response to electrical pulses, together with their unique scalability, are the most important advantages that determine a unique possibility of hardware implementation of SNNs (Demin and Nekhaev, 2018; Guo et al., 2019) based on the processes of self-organization in neural network architectures and qualitatively different from traditional neural networks (perceptrons). We believe that implementation of brain-like networks of future generations will be based on the stochastic dynamics of memristors and synchronization of neural oscillators.…”
Section: Memristive Devices: Toward CMOS Integration (mentioning)
Confidence: 99%
“…A network inspired by the autoencoder of the neural-network literature is trained without labels, layer-wise, to reconstruct the MNIST and CIFAR-10 datasets, with its output trained in a supervised fashion to perform classification [27]. A learning algorithm for SNNs is developed in [28] on the supposition that neurons tend to maximize their activity in competition with other neurons. Three-layer networks of Izhikevich regular-spiking neurons are trained with STDP on the binary task of distinguishing the digits 0 and 1, as well as on the full set of MNIST digits [29].…”
Section: Related Work (mentioning)
Confidence: 99%
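The last statement refers to networks of Izhikevich regular-spiking neurons. For reference, the Izhikevich (2003) model with the standard regular-spiking parameters (a = 0.02, b = 0.2, c = -65, d = 8) can be simulated in a few lines; the constant input current in the usage line is an illustrative assumption.

```python
# Izhikevich (2003) neuron, 'regular spiking' parameter set.
a, b, c, d = 0.02, 0.2, -65.0, 8.0

def simulate(I, T=1000.0, dt=0.5):
    """Simulate one neuron for T ms under input current I (model units);
    returns the spike times in ms."""
    v, u = c, b * c                   # resting initial conditions
    spikes = []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                 # spike peak reached
            spikes.append(step * dt)
            v, u = c, u + d           # after-spike reset
    return spikes

print(simulate(I=10.0)[:5])  # first few spike times under constant drive
```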