2022
DOI: 10.3389/frai.2022.680165
A Synaptic Pruning-Based Spiking Neural Network for Hand-Written Digits Classification

Abstract: A spiking neural network model inspired by synaptic pruning is developed and trained to extract features of hand-written digits. The network is composed of three spiking neural layers and one output neuron whose firing rate is used for classification. The model detects and collects the geometric features of the images from the Modified National Institute of Standards and Technology database (MNIST). In this work, a novel learning rule is developed to train the network to detect features of different digit clas…

Cited by 8 publications (4 citation statements); references 53 publications (67 reference statements).
“…Synapses connecting neurons with high spiking correlation are preserved, while synapses with poor or uncorrelated spiking activity are pruned [64]. Weight dropout also mitigates overfitting in neural networks trained with large size data sets by preventing unwanted specialization towards details and noise in the training data and allowing better generalization [65].…”
Section: Discussion
confidence: 99%
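The correlation-based pruning rule quoted above can be illustrated with a minimal sketch. This is not the paper's actual code; the Pearson-correlation measure over binary spike trains and the `threshold` cutoff are assumptions chosen for illustration:

```python
import numpy as np

def prune_synapses(pre_spikes, post_spikes, weights, threshold=0.2):
    """Prune synapses whose pre/post spike trains are poorly correlated.

    pre_spikes:  (n_pre, T)  binary spike trains of presynaptic neurons
    post_spikes: (n_post, T) binary spike trains of postsynaptic neurons
    weights:     (n_post, n_pre) synaptic weight matrix
    threshold:   correlation below which a synapse is removed (illustrative)
    """
    pruned = weights.copy()
    for i in range(post_spikes.shape[0]):
        for j in range(pre_spikes.shape[0]):
            a, b = pre_spikes[j], post_spikes[i]
            if a.std() == 0 or b.std() == 0:
                corr = 0.0  # a silent neuron is treated as uncorrelated
            else:
                # Pearson correlation between the two spike trains
                corr = np.corrcoef(a, b)[0, 1]
            if corr < threshold:
                pruned[i, j] = 0.0  # weak/uncorrelated synapse is removed
    return pruned
```

Synapses between strongly correlated neuron pairs keep their weights, while weakly or negatively correlated pairs are zeroed out, which is the behavior the quoted statement describes.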
“…Moreover, unlike the methods in [26,27], the SSR methods can calculate the gradient without observing the membrane potential, which may simplify the learning system on hardware. Finally, in addition to the reduction in the firing rate, the combination of binarized weights [48] and pruned weights [49,27,50] is expected to make the SNN model more suitable for hardware implementation.…”
Section: Discussion
confidence: 99%
“…Every neuron processes 785 inputs, each input being multiplied by a corresponding weight. The aggregated sum of these products yields a final score, which is subsequently normalized using the sigmoid function [8].…”
Section: Algorithm
confidence: 99%
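The scoring step quoted above can be sketched as follows. This is an illustrative reconstruction, assuming the 785 inputs are the 784 pixels of a flattened 28×28 MNIST image plus one constant bias input (the quoted statement does not specify the 785th input):

```python
import numpy as np

def sigmoid(z):
    """Logistic function used to normalize the neuron's raw score."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron_score(pixels, weights):
    """Score a flattened 28x28 MNIST image with a single sigmoid neuron.

    pixels:  (784,) flattened image, values assumed in [0, 1]
    weights: (785,) one weight per pixel plus a bias weight (assumption)
    """
    x = np.append(pixels, 1.0)  # 785th input: constant bias term
    return sigmoid(x @ weights)  # weighted sum, then sigmoid normalization
```

With all-zero weights the weighted sum is 0 and the sigmoid returns 0.5, the midpoint of its output range.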