2021
DOI: 10.1109/access.2021.3083056
Spiking Neural Networks With Time-to-First-Spike Coding Using TFT-Type Synaptic Device Model

Abstract: In hardware-based spiking neural networks (SNNs), the conversion of analog input data into the arrival time of an input pulse is regarded as a good candidate for the encoding method due to its bioplausibility and power efficiency. In this work, we trained an SNN encoded by time-to-first-spike (TTFS) and performed an inference process using the behavior of the fabricated TFT-type flash synaptic device. The exponentially decaying synaptic current model required in the inference process was implemented by reading…
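As a rough illustration of the encoding and readout described in the abstract, the sketch below maps a normalized analog input to a first-spike time (stronger inputs fire earlier) and models the post-spike synaptic current as an exponential decay. The time window, decay constant, and function names are illustrative assumptions and are not taken from the paper or the fabricated device, which reads this behavior directly from the TFT-type flash cell.

```python
import numpy as np

def ttfs_encode(x, t_max=100.0):
    """Map a normalized analog input x in [0, 1] to a first-spike time.

    Stronger inputs spike earlier; x = 0 never spikes within the window.
    t_max is an assumed encoding window (illustrative, not from the paper).
    """
    x = np.clip(x, 0.0, 1.0)
    return np.where(x > 0, (1.0 - x) * t_max, np.inf)

def synaptic_current(t, t_spike, weight, tau=20.0):
    """Exponentially decaying synaptic current after the input spike arrives.

    tau is an assumed decay constant; in the paper this decay is obtained by
    reading the fabricated synaptic device rather than computed in software.
    """
    dt = t - t_spike
    return np.where(dt >= 0, weight * np.exp(-dt / tau), 0.0)

# Example: a bright pixel (0.9) spikes early, a dim pixel (0.2) spikes late.
t = np.linspace(0, 100, 5)
for x in (0.9, 0.2):
    t_spike = ttfs_encode(x)
    print(x, float(t_spike), synaptic_current(t, t_spike, weight=0.5))
```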

Cited by 7 publications (5 citation statements)
References 22 publications
“…In contrast, TTFS coding, which resembles natural vision, can provide important spatiotemporal information for the implementation of SNNs to process dynamic visual data with high sparsity and high energy efficiency. Hardware-based SNNs with TTFS coding are more efficient than SNNs with rate coding, demonstrating faster speed and reduced energy consumption [23][24][25][26]. Although precise temporal encoding has been achieved in artificial visual neuron systems, fusing rate and TTFS coding in a single spike train has not yet been realized in SNN hardware, compromising the capacity of such networks to rapidly and accurately process visual data in complex visual environments [27][28][29][30][31][32].…”
Section: Introduction (mentioning)
confidence: 99%
“…As a result, the time resolution improves with CMOS scaling, leading to a growing interest in the time domain, as reported by [8,9]. This motivated the researchers to create electronic sensor systems that use spike or time-coded signals, which possess a technology-agnostic property that remains robust even as technology scales up, as demonstrated in [10][11][12][13][14][15]. A scalable ADC based on the neural engineering framework was proposed by the authors in [10].…”
Section: Introduction (mentioning)
confidence: 99%
“…The process of translating analog input values into input pulse frequencies through a large number of spikes can result in increased power consumption as the spiking neural network (SNN) structure grows deeper and larger. Consequently, these traits may not be well-suited for power-efficient and robust devices in edge computing [13,15].…”
Section: Introduction (mentioning)
confidence: 99%
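To make the sparsity argument in the excerpt above concrete, the following back-of-the-envelope sketch counts spikes per input for rate coding versus TTFS coding over an assumed encoding window; the window length and maximum firing rate are illustrative assumptions, not figures from the cited works.

```python
import numpy as np

def rate_coding_spike_count(x, t_window=100.0, max_rate=0.2):
    """Rate coding: spike count scales with input intensity (rate * window).

    max_rate (spikes per time step) and t_window are assumed values.
    """
    return int(np.clip(x, 0.0, 1.0) * max_rate * t_window)

def ttfs_spike_count(x):
    """TTFS coding: at most one spike per input, regardless of intensity."""
    return 1 if x > 0 else 0

inputs = [0.1, 0.5, 0.9]
print("rate coding :", [rate_coding_spike_count(x) for x in inputs])  # [2, 10, 18]
print("TTFS coding :", [ttfs_spike_count(x) for x in inputs])         # [1, 1, 1]
```

Under these assumptions the spike count (and hence switching energy) grows with input intensity and network depth for rate coding, while TTFS coding stays at one spike per input, which is the power-efficiency point the excerpt makes.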
“…TFS coding involves mapping input data into a single spike where the information is contained in the time taken to first spike. Hardware implementation of temporal encoding has focused only on the TFS approach in the past [12,15] due to its simplicity of circuit implementation. However, it can only represent single-dimensional information [14].…”
Section: Introduction (mentioning)
confidence: 99%