2018 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2018.8489410
Mastering the Output Frequency in Spiking Neural Networks

Cited by 13 publications (8 citation statements)
References 25 publications
“…An exponential leak is applied to help neurons with weak activity. However, this mechanism relies on two parameters, which makes the search for suitable values difficult [13]. Moreover, these parameters do not improve convergence towards the different types of patterns shown in Figure 2.…”
Section: A. Time Target Threshold Adaptation (mentioning)
confidence: 99%
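For illustration, here is a minimal Python sketch of a two-parameter exponential leak of the kind the excerpt describes. The parameter names tau (time constant) and alpha (leak amplitude) are assumptions, not the notation of [13], and the exact formulation there may differ. The example also hints at why two coupled parameters are hard to search: distinct (tau, alpha) pairs can yield the same effective decay.

```python
import numpy as np

def apply_exponential_leak(potential, dt, tau, alpha):
    """Decay a membrane potential toward zero over a time step dt.

    tau   -- leak time constant (assumed name); larger = slower decay
    alpha -- leak amplitude (assumed name) scaling the decay rate

    The cited mechanism uses two parameters, but its exact form
    is not given in the excerpt; this is only a plausible sketch.
    """
    return potential * np.exp(-alpha * dt / tau)

# Different (tau, alpha) pairs produce identical decay, so a grid
# search over the two parameters explores a degenerate landscape:
for tau, alpha in [(0.1, 1.0), (0.2, 2.0)]:
    print(apply_exponential_leak(1.0, dt=0.01, tau=tau, alpha=alpha))
```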
“…Computational neuroscience has provided a new idea for unsupervised learning mechanisms. Spike-Timing-Dependent Plasticity (STDP) (Abbott and Nelson, 2000; Song et al., 2000; Caporale and Dan, 2008; Tavanaei et al., 2018; Falez et al., 2019) is a temporally asymmetric form of Hebbian learning and the most widely used unsupervised learning mechanism in SNNs. The relative timing of presynaptic and postsynaptic action potentials regulates the synaptic weights, a feature unique to SNNs.…”
Section: A New Approach of Unsupervised Learning (mentioning)
confidence: 99%
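For reference, the standard pair-based STDP window from the literature (not specific to any single cited paper) captures the temporal asymmetry the excerpt describes:

```latex
\Delta w =
\begin{cases}
A_{+}\, e^{-\Delta t/\tau_{+}}, & \Delta t > 0 \quad \text{(pre before post: potentiation)}\\
-A_{-}\, e^{\Delta t/\tau_{-}}, & \Delta t < 0 \quad \text{(post before pre: depression)}
\end{cases}
\qquad \Delta t = t_{\text{post}} - t_{\text{pre}}
```

Here $A_{+}, A_{-}$ are learning amplitudes and $\tau_{+}, \tau_{-}$ are time constants; the sign flip across $\Delta t = 0$ is exactly what makes STDP a temporally asymmetric form of Hebbian learning.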
“…We observed that during learning, some neurons fail to specialize on a specific class, which degrades network performance and leads to false classifications. On the other hand, applying Progressive Pruning to the network decreases the average spiking frequency, which benefits energy consumption but may cause a frequency loss in multilayer networks, a known issue in Convolutional Spiking Neural Networks (CSNNs) as described in [24]. Based on these observations, we propose Dynamic Synaptic Weight Reinforcement (DSWR), which reinforces the synapses that are conserved and considered critical, pushing neurons that did not specialize in a specific pattern or class to do so while keeping the average spiking frequency of the network near the baseline.…”
Section: B. Dynamic Synaptic Weight Reinforcement (mentioning)
confidence: 99%
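The excerpt states only that DSWR reinforces conserved, critical synapses of unspecialized neurons; the update rule below is a hypothetical Python sketch of that idea, not the authors' method. All names (dswr_step, critical_mask, specialized, rate, w_max) are illustrative assumptions.

```python
import numpy as np

def dswr_step(weights, critical_mask, specialized, rate=0.05, w_max=1.0):
    """Hypothetical Dynamic Synaptic Weight Reinforcement step.

    weights       -- (num_neurons, num_synapses) weight matrix
    critical_mask -- boolean array, True for synapses conserved
                     by Progressive Pruning and deemed critical
    specialized   -- boolean vector, True if a neuron has already
                     specialized on a class or pattern
    """
    # Reinforce only the critical synapses of unspecialized neurons.
    boost = rate * critical_mask * (~specialized)[:, None]
    # Soft-bounded update: growth slows as weights approach w_max,
    # which helps keep the average firing frequency near baseline.
    return np.clip(weights + boost * (w_max - weights), 0.0, w_max)

# Example: 3 neurons, 4 synapses; only neuron 2 is unspecialized.
w = np.full((3, 4), 0.4)
mask = np.array([[1, 0, 1, 0]] * 3, dtype=bool)
spec = np.array([True, True, False])
print(dswr_step(w, mask, spec))
```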
“…Applying PP to the network decreases the average spiking frequency, which benefits energy consumption. However, it may cause a frequency loss in multilayer networks, a known issue in convolutional spiking neural networks (CSNNs) as described in Reference 37.…”
Section: Contribution (mentioning)
confidence: 99%