2022
DOI: 10.1109/access.2022.3149577
Neuron Circuits for Low-Power Spiking Neural Networks Using Time-To-First-Spike Encoding

Abstract: Hardware-based Spiking Neural Networks (SNNs) are regarded as promising candidates for cognitive computing systems due to their low power consumption and highly parallel operation. In this paper, we train an SNN in which the firing time carries information, using temporal backpropagation. The temporally encoded SNN with 512 hidden neurons achieved an accuracy of 96.90% on the MNIST test set. Furthermore, the effect of device variation on the accuracy of the temporally encoded SNN is investigated and compare…
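The abstract's scheme, in which "the firing time carries information," can be illustrated with a minimal time-to-first-spike (TTFS) encoder. The linear latency rule and the `t_max` window below are illustrative assumptions for this sketch, not the paper's actual neuron circuit:

```python
import numpy as np

def ttfs_encode(pixels, t_max=100.0):
    """Map normalized pixel intensities in [0, 1] to first-spike times:
    brighter pixels fire earlier; zero-intensity pixels never fire."""
    pixels = np.asarray(pixels, dtype=float)
    times = t_max * (1.0 - pixels)   # linear latency code (assumed form)
    times[pixels <= 0.0] = np.inf    # no spike for silent inputs
    return times

# Three pixels of increasing brightness -> later, middle, earliest spike
times = ttfs_encode([0.0, 0.5, 1.0])  # [inf, 50.0, 0.0]
```

Because each input contributes at most one spike per presentation, information sits entirely in *when* that spike occurs, which is what makes TTFS attractive for low-power hardware.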


Cited by 12 publications (10 citation statements); References 53 publications
“…Although incorporating an additional comparator could complicate the circuit operation of the TTFS approach compared to the conventional rate encoding method, this technique is highly desirable for temporal-efficient information processing as it only looks for a single spike regardless of the input intensity. [18,19,27] Figure 4e shows the extracted spike time from various input intensities with P th from 0.005 to 0.5. The exponential correlation between the spike time and input signal can be distinctly identified, in which the generated spike times are inversely proportional to the input intensities.…”
Section: Synaptic Plasticity and Spike Generation in Ferroelectric En…
Classification: mentioning (confidence: 99%)
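The inverse relation between spike time and input intensity described in the citation above can be sketched with a simple saturating-response model. The function `spike_time`, the RC-like form v(t) = 1 − exp(−I·t/τ), and the time constant τ are assumptions for illustration only, not the ferroelectric device equations of the cited work:

```python
import numpy as np

def spike_time(intensity, p_th, tau=1.0):
    """First time at which the assumed saturating response
    v(t) = 1 - exp(-intensity * t / tau) crosses the threshold p_th.
    Solving v(t) = p_th gives t = -(tau / intensity) * ln(1 - p_th),
    so the spike time is inversely proportional to input intensity."""
    return -(tau / intensity) * np.log(1.0 - p_th)

# Sweep the threshold over the range quoted above (0.005 to 0.5)
for p_th in (0.005, 0.05, 0.5):
    print(p_th, spike_time(1.0, p_th))
```

Under this model, doubling the input intensity halves the spike time, reproducing the "inversely proportional" behavior the citing authors report for their encoder.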
“…Our ferroelectric encoder, synergetically exploiting the robust ferroelectric response and TTFS approach, significantly improves the encoding efficiency compared to other hardware encoders using the conventional rate encoding approach. [15][16][17][18][19][20] Although the rate encoding has been widely employed, redundant timesteps (e.g., >200 timesteps) are necessary to accurately observe the input intensity, which reduces the encoding efficiency. [20] Furthermore, the precision of our ferroelectric encoder is quantitatively assessed by the standard MNIST dataset using a simulated SNN to perform the classification task of handwritten digits.…”
Section: Spike Sparsity and Digits Classification Accuracy
Classification: mentioning (confidence: 99%)
“…Historically, TTFS was first proposed to explain the phenomenal speed of processing in the brain for certain tasks, such as object recognition (Thorpe and Imbert, 1989 ). More recently, TTFS has attracted much attention from the AI community (Mostafa, 2017 ; Rueckauer and Liu, 2018 ; Zhou et al, 2019 ; Kheradpisheh and Masquelier, 2020 ; Park et al, 2020 ; Sakemi et al, 2020 ; Zhang et al, 2020 ; Comsa et al, 2021 ; Mirsadeghi et al, 2021 ), because it can be efficiently implemented on low power event-driven neuromorphic chips (Abderrahmane et al, 2020 ; Nair et al, 2020 ; Srivatsa et al, 2020 ; Göltz et al, 2021 ; Liang et al, 2021 ; Oh et al, 2022 ), leveraging two key features. The first one is sparsity (Frenkel, 2021 ).…”
Section: Introduction
Classification: mentioning (confidence: 99%)
“…It significantly increases the sparsity of the output spike train, thereby giving large energy savings. It has recently been used in SNNs for classification problems [12][13][14] and hardware implementation of such schemes is also being explored [15]. There is also building evidence of the biological plausibility of such temporal coding schemes [16,17].…”
Section: Introduction
Classification: mentioning (confidence: 99%)