2020
DOI: 10.1109/access.2020.2994360
Efficient Spiking Neural Networks With Logarithmic Temporal Coding

Abstract: A Spiking Neural Network (SNN) can be trained indirectly by first training an Artificial Neural Network (ANN) with the conventional backpropagation algorithm, then converting it into an equivalent SNN. To reduce the computational cost of the resulting SNN as measured by the number of spikes, we present Logarithmic Temporal Coding (LTC), where the number of spikes used to encode an activation grows logarithmically with the activation value; and the accompanying Exponentiate-and-Fire (EF) neuron model, which onl…
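As a rough, self-contained sketch of the LTC idea (an illustration, not the authors' exact algorithm; the function names, the 8-bit quantization, and the MSB-first time axis are assumptions): each set bit of the quantized activation becomes one spike, so the spike count grows at most logarithmically with the encoded value.

```python
# Illustrative sketch of Logarithmic Temporal Coding (LTC): an activation is
# quantized, and each set bit of its binary representation becomes one spike,
# so the number of spikes grows logarithmically with the activation value.
# Names, quantization step, and time axis are assumptions, not the paper's
# exact scheme.

def ltc_encode(activation: float, num_bits: int = 8) -> list:
    """Return spike times, one per set bit, most significant bit at t=0."""
    q = int(round(activation * (2 ** num_bits - 1)))
    q = max(0, min(q, 2 ** num_bits - 1))
    # Bit i (counted from the most significant) maps to time step i.
    return [t for t in range(num_bits) if (q >> (num_bits - 1 - t)) & 1]

def ltc_decode(spike_times: list, num_bits: int = 8) -> float:
    """Invert the encoding: sum the powers of two indicated by spike times."""
    q = sum(1 << (num_bits - 1 - t) for t in spike_times)
    return q / (2 ** num_bits - 1)

for a in (0.1, 0.5, 0.9):
    spikes = ltc_encode(a)
    print(f"a={a:.2f} -> {len(spikes)} spike(s) at t={spikes}, "
          f"decoded={ltc_decode(spikes):.3f}")
```

Running this shows, for example, that a = 0.5 needs only a single spike, while a value with a dense binary expansion such as a = 0.9 needs several.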

Cited by 9 publications (8 citation statements) | References 20 publications
“…Another direction is to use temporal instead of rate-based spike coding to represent ANN activities. Zhang et al. [128] have shown that temporal coding on a logarithmic time scale is among the most efficient schemes, second only to time-to-first-spike (TTFS) [129] coding. However, compared to TTFS, the logarithmic time scale approach is more compatible with popular deep learning techniques.…”
Section: Outlook: A Deep Network (mentioning)
confidence: 99%
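To make the quoted efficiency comparison concrete, here is a back-of-the-envelope worst-case spike count per activation at an assumed 8-bit precision (exact figures depend on each scheme's parameters):

```python
# Worst-case spike counts per activation at 8-bit precision.
# Assumed setup for illustration; actual costs depend on each scheme's details.
num_bits = 8
worst_case = {
    "rate coding": 2 ** num_bits - 1,  # up to one spike per unit of activity
    "logarithmic (LTC)": num_bits,     # at most one spike per binary digit
    "TTFS": 1,                         # one spike; information in its timing
}
for scheme, spikes in worst_case.items():
    print(f"{scheme:>18}: <= {spikes} spikes")
```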
“…A different approach uses a globally referenced binary coding to reduce the number of spikes. Together with neuron models with exponential input characteristics, the same neuron activation can be reached as with a count rate code, but with far fewer spikes [110].…”
Section: Rate Coding (mentioning)
confidence: 95%
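A minimal sketch of a neuron with an exponential input characteristic, in the spirit of the Exponentiate-and-Fire model named in the abstract: the membrane potential doubles every time step (a left bit-shift in fixed-point arithmetic), so a spike arriving t steps before readout effectively counts 2^t times. The threshold value and the subtractive reset are assumptions for illustration.

```python
# Sketch of an exponentiate-and-fire style neuron: each time step the membrane
# potential doubles (a left bit-shift in fixed point), so earlier spikes carry
# exponentially larger weight. Threshold and reset policy are assumed for
# illustration, not taken from the paper.

def ef_neuron(input_spikes: list, weight: int, threshold: int,
              num_steps: int) -> list:
    """Simulate one neuron; input_spikes holds a 0/1 entry per time step."""
    v = 0
    out = []
    for t in range(num_steps):
        v = (v << 1) + weight * input_spikes[t]  # doubling plus weighted input
        if v >= threshold:
            out.append(t)       # emit an output spike at this step
            v -= threshold      # subtractive reset keeps the residual
    return out

# A spike at step 0 accumulates as weight * 2^t by step t.
print(ef_neuron([1, 0, 0, 1], weight=3, threshold=16, num_steps=4))  # [3]
```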
“…Here, each spike corresponds to a "1" or a "0" in a bit stream. Relative to a fixed reference clock, two schemes can encode the bits: the presence or absence of a spike within a given interval [110], or the timing of the spike within the interval [33]. In the former case, a logical "1" corresponds to a spike being present during one clock cycle.…”
Section: Global Referenced (mentioning)
confidence: 99%
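A sketch contrasting the two reference-clock schemes just quoted (the interval length and the early/late timing convention are assumptions): in the first, a "1" is the presence of a spike in a clock cycle; in the second, every cycle contains a spike and its position within the cycle carries the bit.

```python
# Two ways to encode a bit stream against a reference clock, as described
# above. Interval length and timing conventions are assumptions.

def encode_presence(bits):
    """Scheme 1: a '1' is a spike in its clock cycle, a '0' is silence."""
    return [t if b else None for t, b in enumerate(bits)]

def encode_timing(bits, interval: float = 1.0):
    """Scheme 2: every cycle has a spike; its position inside the cycle is
    the bit. Here (assumed): early spike = 1, late spike = 0."""
    return [t * interval + (0.25 if b else 0.75) * interval
            for t, b in enumerate(bits)]

bits = [1, 0, 1, 1]
print(encode_presence(bits))  # [0, None, 2, 3]
print(encode_timing(bits))    # [0.25, 1.75, 2.25, 3.25]
```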
“…The other coding scheme is phase (or binary) coding, which combines the advantages of both rate and temporal coding [20], [21], [29]. Phase coding assigns a different weight w_t to each time segment t within T, so that activation values are encoded by both the number of spikes and the spike pattern.…”
Section: A. Activation Encoding in ANN-to-SNN Conversion (mentioning)
confidence: 99%
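A sketch of phase coding as just described, with weight w_t = Q^-t assigned to time segment t (Q = 2 gives binary coding); the greedy rounding used to pick segments is an assumption.

```python
# Sketch of phase (binary) coding: time segment t carries weight w_t = Q**-t,
# so a spike pattern over T segments encodes an activation in [0, 1).
# Q = 2 gives binary coding; the greedy rounding below is an assumption.

def phase_encode(a: float, T: int = 8, Q: float = 2.0) -> list:
    """Greedily pick segments so that sum(Q**-t for chosen t) approximates a."""
    spikes, residual = [], a
    for t in range(1, T + 1):
        w_t = Q ** -t
        if residual >= w_t:
            spikes.append(t)
            residual -= w_t
    return spikes

def phase_decode(spikes: list, Q: float = 2.0) -> float:
    return sum(Q ** -t for t in spikes)

s = phase_encode(0.8125)   # 0.8125 = 1/2 + 1/4 + 1/16
print(s, phase_decode(s))  # [1, 2, 4] 0.8125
```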
“…Each spike at time t has a phase Q^{-t}, where Q = 2 for binary coding [29]. The latter is used to realize the phase coding in this paper.…”
Section: B. Neuron Model (mentioning)
confidence: 99%
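One plausible reading of a neuron model realizing phase coding (not necessarily the exact model of [29]): each incoming spike in segment t is scaled by its phase Q^-t before being accumulated into the membrane potential.

```python
# Sketch of a neuron integrating phase-coded spike trains: a spike arriving in
# segment t contributes w * Q**-t to the membrane potential. An illustrative
# reading of phase coding, not the exact model in [29].

def integrate_phase_coded(spike_trains: dict, weights: dict,
                          Q: float = 2.0) -> float:
    v = 0.0
    for name, train in spike_trains.items():
        for t in train:
            v += weights[name] * Q ** -t  # phase-weighted contribution
    return v

v = integrate_phase_coded({"x1": [1, 3], "x2": [2]},
                          {"x1": 0.5, "x2": -0.25})
print(v)  # 0.5*(2**-1 + 2**-3) + (-0.25)*2**-2 = 0.25
```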