The 2018 Conference on Artificial Life
DOI: 10.1162/isal_a_00055
The Evolution of Training Parameters for Spiking Neural Networks with Hebbian Learning

Abstract: Spiking neural networks, thanks to their sensitivity to the timing of inputs, are a promising tool for the unsupervised processing of spatio-temporal data. However, they do not yet perform as well as traditional machine learning approaches, and their real-world applications remain limited. Various supervised and reinforcement learning methods for optimising spiking neural networks have been proposed, but more recently the evolutionary approach has regained attention as a tool for training neural networks. Here, …

Cited by 7 publications (3 citation statements, 2019–2023); references 23 publications.
“…Another important feature is spike-timing-dependent plasticity (STDP) [34], which can be used to implement Hebbian learning [35], [36]: it is the ability of a synapse to modify its state (typically its weight) depending on local network activity (pre- and post-synaptic spikes). Plasticity is implemented by adding an Update Synapses step to the pipeline.…”
Section: Methods
confidence: 99%
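The pair-based STDP rule described in this excerpt can be sketched as follows. This is a generic illustration, not the implementation from the cited paper; the time constant and learning-rate parameters are illustrative assumptions.

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.05, a_minus=0.055, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update.

    dt = t_post - t_pre in milliseconds. A pre-synaptic spike shortly
    before a post-synaptic spike (dt > 0) potentiates the synapse;
    the reverse order (dt < 0) depresses it. The change decays
    exponentially with |dt|. All parameter values are illustrative.
    """
    if dt > 0:
        dw = a_plus * np.exp(-dt / tau)   # potentiation
    else:
        dw = -a_minus * np.exp(dt / tau)  # depression
    return float(np.clip(w + dw, w_min, w_max))
```

In a simulation pipeline, such a function would be called in the "Update Synapses" step for each synapse whose pre- or post-synaptic neuron has just fired.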
“…Recent works blending EC and SNNs are mostly focused on evolving a network's weights, using evolutionary approaches as a learning strategy [20,29,30]. Schuman et al [37] proposed Evolutionary Optimization for Neuromorphic Systems, aiming to train spiking neural networks for classification and control tasks, to train under hardware constraints, to evolve a reservoir for a liquid state machine, and to evolve smaller networks using multi-objective optimization.…”
Section: Related Work
confidence: 99%
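The idea of using an evolutionary approach as the learning strategy, i.e. evolving a network's weight vector against a fitness function, can be sketched with a minimal (1+λ)-style hill climber. This is a generic sketch of the technique, not the method of any cited work; population size, mutation scale, and generation count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def evolve_weights(fitness, n_weights, pop_size=20, generations=50, sigma=0.1):
    """Evolve a flat weight vector by Gaussian mutation.

    `fitness` maps a weight vector to a scalar (higher is better),
    e.g. task reward or classification accuracy of the network built
    from those weights. Each generation, the best individual is
    mutated `pop_size` times and replaced only by an improvement.
    """
    best = rng.normal(0.0, 1.0, n_weights)
    best_fit = fitness(best)
    for _ in range(generations):
        for _ in range(pop_size):
            child = best + rng.normal(0.0, sigma, n_weights)
            f = fitness(child)
            if f > best_fit:
                best, best_fit = child, f
    return best, best_fit
```

Because only the scalar fitness is needed, this scheme requires no gradient through the spiking dynamics, which is what makes it attractive for SNNs.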
“…An important class of learning rules for SNNs are Hebbian learning algorithms, in particular the classical Hebb rule (Hebb 1949; Oja 1982) and its generalisation, spike-timing-dependent plasticity (STDP) (Caporale and Dan 2008; Kozdon and Bentley 2018; Lobov et al 2020; Białas et al 2020; Long 2011). Unlike backpropagation-based algorithms (Shrestha et al 2018; Fang 2021), Hebbian and STDP algorithms can be implemented on neuromorphic hardware and thus benefit from its advantages.…”
Section: Introduction
confidence: 99%
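The classical Hebb rule cited here grows a weight whenever pre- and post-synaptic activity coincide; Oja's 1982 modification adds a decay term that keeps the weight norm bounded. A minimal sketch of one Oja-rule step (learning rate is an illustrative assumption):

```python
import numpy as np

def oja_step(w, x, eta=0.01):
    """One update of Oja's rule for a single linear neuron.

    y = w . x is the post-synaptic activation. The Hebbian term
    eta * y * x is balanced by the decay term -eta * y^2 * w, so
    repeated updates keep ||w|| bounded (unlike the plain Hebb rule)
    and drive w toward the principal direction of the inputs.
    """
    y = float(np.dot(w, x))
    return w + eta * y * (x - y * w)
```

Because the update depends only on locally available quantities (x, y, w), it maps naturally onto neuromorphic hardware, in line with the excerpt's point about STDP and Hebbian rules.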