2010
DOI: 10.1109/tnn.2010.2074212

SWAT: A Spiking Neural Network Training Algorithm for Classification Problems

Abstract: This paper presents a synaptic weight association training (SWAT) algorithm for spiking neural networks (SNNs). SWAT merges the Bienenstock-Cooper-Munro (BCM) learning rule with spike timing dependent plasticity (STDP). The STDP/BCM rule yields a unimodal weight distribution where the height of the plasticity window associated with STDP is modulated, causing stability after a period of training. The SNN uses a single training neuron in the training phase where data associated with all classes is passed…
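The abstract's core idea is that a BCM-style sliding threshold scales the height of the STDP plasticity window so that weights settle into a stable, unimodal distribution. Below is a minimal Python sketch of that idea; the function names, time constants, and the exact modulation form are illustrative assumptions, not the paper's actual equations or parameter values.

```python
import numpy as np

# Minimal sketch of an STDP update whose window height is modulated by a
# BCM-style sliding threshold, as described qualitatively in the abstract.
# All names, time constants, and the modulation form are assumptions.

TAU_PLUS = 20.0     # ms, LTP window time constant (assumed)
TAU_MINUS = 20.0    # ms, LTD window time constant (assumed)
TAU_THETA = 1000.0  # ms, sliding-threshold time constant (assumed)


def slide_threshold(theta, post_rate, dt):
    """Move the BCM threshold toward the squared recent postsynaptic rate."""
    return theta + (dt / TAU_THETA) * (post_rate ** 2 - theta)


def stdp_bcm_dw(delta_t, post_rate, theta, a0=0.01):
    """Weight change for one pre/post spike pair, delta_t = t_post - t_pre (ms).

    The window height is scaled by a BCM-like factor, so plasticity fades
    (and can change sign) as postsynaptic activity settles around theta,
    which is one way to obtain the stability the abstract describes.
    """
    height = a0 * post_rate * (post_rate - theta)  # BCM modulation (assumed form)
    if delta_t >= 0:
        return height * np.exp(-delta_t / TAU_PLUS)  # post after pre: LTP-shaped
    return -height * np.exp(delta_t / TAU_MINUS)     # pre after post: LTD-shaped


# Example: a post spike 5 ms after the pre spike, with postsynaptic activity
# above the current threshold, yields potentiation; the threshold then slides.
theta = 4.0
dw = stdp_bcm_dw(delta_t=5.0, post_rate=10.0, theta=theta)
theta = slide_threshold(theta, post_rate=10.0, dt=1.0)
print(round(dw, 4), round(theta, 4))
```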

Cited by 169 publications (132 citation statements)
References 49 publications
“…STDP is controlled by relative pre- and postsynaptic spike times. Equation (6) specifies that postsynaptic spikes which follow presynaptic spikes cause the synaptic weight to be increased (LTP); in contrast, a synapse is weakened when the presynaptic spike occurs after the postsynaptic spike (LTD).…”
Section: Third Layer: Learning and Output Neurons (mentioning)
confidence: 99%
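The quoted statement gives the standard pairwise STDP sign convention. Equation (6) of the citing paper is not reproduced in this excerpt, so the sketch below uses the common exponential window purely to illustrate that convention; the amplitudes and time constants are assumptions.

```python
import math

A_PLUS, A_MINUS = 0.01, 0.012   # assumed LTP/LTD amplitudes
TAU_PLUS = TAU_MINUS = 20.0     # ms, assumed time constants


def pairwise_stdp(t_pre, t_post):
    """Sign convention from the quote: post after pre -> LTP, pre after post -> LTD."""
    dt = t_post - t_pre
    if dt >= 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)   # potentiation
    return -A_MINUS * math.exp(dt / TAU_MINUS)     # depression


print(pairwise_stdp(t_pre=10.0, t_post=15.0))  # > 0: LTP
print(pairwise_stdp(t_pre=15.0, t_post=10.0))  # < 0: LTD
```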
“…Spiking neurons and adaptive synapses between neurons contribute to a new approach to cognition, decision making, and learning [4][5][6][7][8].…”
Section: Introduction (mentioning)
confidence: 99%
“…All the LTAs use packets to transmit the information [4]. The rate encoding scheme is used in this paper, and the SNN can be trained using the learning algorithm in previous work [11]. The Xilinx Zynq-7000 development board (with an XC7Z020-CLG484 FPGA device) is used as the hardware platform.…”
Section: Functional Evaluation (mentioning)
confidence: 99%
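The statement mentions rate encoding of inputs for an SNN deployed on the FPGA platform, but the excerpt does not detail the specific scheme. The sketch below shows Poisson rate coding, one common choice; the maximum rate, duration, and time step are assumptions for illustration.

```python
import numpy as np


def poisson_rate_encode(values, max_rate_hz=100.0, duration_ms=100, dt_ms=1.0, seed=0):
    """Encode features in [0, 1] as Poisson spike trains whose mean firing rate
    is proportional to the feature value (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    rates = np.clip(values, 0.0, 1.0) * max_rate_hz   # target rate per input, Hz
    p_spike = rates * dt_ms / 1000.0                  # spike probability per time step
    steps = int(duration_ms / dt_ms)
    return rng.random((steps, len(values))) < p_spike  # boolean spike raster


spikes = poisson_rate_encode(np.array([0.1, 0.5, 0.9]))
print(spikes.shape, spikes.sum(axis=0))  # larger feature values produce more spikes
```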
“…What matters in developing neural networks is that they learn to recognize and apply relationships between objects and patterns of objects in the real world. In this respect, neural networks are tools that can be used to solve difficult problems [8], [9], [10]. Artificial neural networks are inspired by the architecture of the biological nervous system, which consists of a large number of relatively simple neurons that work in parallel to facilitate rapid decision-making [11].…”
Section: Neural Network Based Model (mentioning)
confidence: 99%