2019
DOI: 10.3389/fnins.2019.00405

A Soft-Pruning Method Applied During Training of Spiking Neural Networks for In-Memory Computing Applications

Abstract: Inspired by the computational efficiency of the biological brain, spiking neural networks (SNNs) emulate biological neural networks, neural codes, dynamics, and circuitry. SNNs show great potential for the implementation of unsupervised learning using in-memory computing. Here, we report an algorithmic optimization that improves energy efficiency of online learning with SNNs on emerging non-volatile memory (eNVM) devices. We develop a pruning method for SNNs by exploiting the output firing characteristics of…
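
The abstract is truncated above, but its gist (freezing low-utility synapses during training rather than deleting them) can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the spike-count criterion, the threshold, and all names are assumptions.

```python
import numpy as np

# Hypothetical sketch of soft pruning driven by output firing activity.
# Synapses onto neurons that fire rarely are frozen (soft-pruned), not deleted.

rng = np.random.default_rng(seed=0)
n_inputs, n_outputs = 784, 100
weights = rng.uniform(0.0, 1.0, size=(n_inputs, n_outputs))  # synaptic weights
trainable = np.ones_like(weights, dtype=bool)                # False = soft-pruned
spike_counts = np.zeros(n_outputs)                           # output activity tally

def soft_prune_by_activity(trainable, spike_counts, min_spikes=5):
    """Freeze all synapses onto output neurons that have fired fewer than
    `min_spikes` times so far. Frozen synapses keep their values but receive
    no further updates (the 'soft' part of soft pruning)."""
    inactive = spike_counts < min_spikes
    trainable[:, inactive] = False
    return trainable
```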

Citation Types: 0 supporting, 43 mentioning, 0 contrasting

Year Published: 2020–2024

Cited by 38 publications (43 citation statements)
References 52 publications (69 reference statements)
“…The software simulation is programmed in Python language and runs sequentially in a single process. The runtime decreases with the increasing pruning percentage (decreasing connectivity), which proves that the proposed online pruning method is able to shorten the network …”
Section: Implementation Overhead (mentioning)
confidence: 66%
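
The runtime effect described in this quote follows from the training loop touching fewer synapses as connectivity drops. A rough sketch of such a masked update (the toy STDP-like rule and all variable names are placeholders, not the cited implementation):

```python
import numpy as np

def masked_stdp_step(weights, trainable, pre_trace, post_spikes, lr=0.01):
    """Toy STDP-like update applied only to still-trainable synapses.

    Soft-pruned synapses are skipped, so the per-step work (and hence the
    runtime of a sequential, single-process simulation) shrinks as the
    pruning percentage grows."""
    rows, cols = np.nonzero(trainable)            # unpruned connections only
    weights[rows, cols] += lr * pre_trace[rows] * post_spikes[cols]
    np.clip(weights, 0.0, 1.0, out=weights)       # keep weights in bounds
    return weights
```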
“…An online soft weight pruning method for unsupervised SNNs was reported in Shi et al. (2019). Unlike conventional pruning methods, instead of removing the pruned weights, this method sets the pruned weights constant at the lowest possible weight value or the current value and stops updating them for the rest of the training process.…”
Section: Comparisons and Discussion (mentioning)
confidence: 99%
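
The freeze mechanics this quote describes can be sketched in a few lines; the function name, mask convention, and the assumed [0, 1] weight range are illustrative choices, with both variants from the quote shown:

```python
import numpy as np

W_MIN = 0.0  # assumed lowest possible weight in a [0, 1] weight range

def soft_prune(weights, prune_mask, mode="min"):
    """Soft-prune without removing connections, as the quote describes.

    mode="min"    -> set pruned weights to the lowest possible value
    mode="freeze" -> keep pruned weights at their current value
    In both modes the synapses stay in the network but are marked
    non-trainable for the rest of training."""
    if mode == "min":
        weights[prune_mask] = W_MIN
    # mode == "freeze": values are left untouched
    trainable = ~prune_mask
    return weights, trainable

# Subsequent updates are then gated by the mask, e.g.:
#   weights[trainable] += delta[trainable]
```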