2019
DOI: 10.1609/aaai.v33i01.33011319

TDSNN: From Deep Neural Networks to Deep Spike Neural Networks with Temporal-Coding

Abstract: Continuous-valued deep convolutional networks (DNNs) can be converted into accurate rate-coding based spike neural networks (SNNs). However, the substantial computational and energy costs caused by multiple spikes limit their use in mobile and embedded applications. Recent works have shown that the newly emerged temporal-coding based SNNs converted from DNNs can reduce the computational load effectively. In this paper, we propose a novel method to convert DNNs to temporal-coding SNNs, called TDS…
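To illustrate why multiple spikes are costly, the sketch below compares the number of spikes a layer of normalized activations produces under rate coding versus time-to-first-spike (TTFS) temporal coding. It is a generic illustration, not the paper's TDSNN conversion; the window length T and the linear rate/TTFS mappings are assumptions.

```python
# Hedged sketch (not the TDSNN algorithm): spike-count cost of rate coding
# versus TTFS temporal coding for one layer of normalized activations.
import numpy as np

T = 100  # assumed simulation window in time steps

def rate_code_spike_count(activations, T=T):
    """Rate coding: a unit with activation a in [0, 1] fires roughly a*T spikes."""
    return int(np.round(np.clip(activations, 0.0, 1.0) * T).sum())

def ttfs_spike_count(activations):
    """TTFS coding: each active unit fires at most one spike,
    whose timing (earlier for larger a) carries the value."""
    return int((np.asarray(activations) > 0).sum())

acts = np.random.rand(1000)          # toy layer of 1000 normalized activations
print(rate_code_spike_count(acts))   # on the order of 50,000 spikes
print(ttfs_spike_count(acts))        # at most 1,000 spikes
```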

Cited by 48 publications (47 citation statements)
References 19 publications
“…With the conversion, deep SNNs achieved comparable results to DNNs in various applications, including image classification [23] and multi-object detection [25]. However, conversion approaches are limited in efficiency [31], which is defined with latency and the number of spikes, even if the converted SNNs adopted several temporal coding schemes, such as TTFS coding [16,17].…”
Section: Training Methods of Deep SNNs (mentioning)
Confidence: 99%
“…Neural coding defines the way of representing the information in the form of spike trains including encoding and decoding function [22]. There have been various types of neural coding, such as rate [23,24,25], phase [26], burst [27], temporal-switching-coding (TSC) [28], and TTFS coding [12,16,17,15]. To maximize the efficiency by fully utilizing the temporal information in spike train, TTFS coding, which is known as latency coding, was introduced in SNNs [12].…”
Section: Spiking Neural Network (mentioning)
Confidence: 99%
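The quoted passage above describes TTFS (latency) coding, in which information is carried by the timing of a neuron's first spike rather than by its firing rate. The sketch below shows one common encode/decode convention under an assumed linear mapping and window T_MAX; it is an illustration only, not the specific scheme used in [12] or [16, 17].

```python
# Hedged sketch of TTFS (latency) encoding/decoding: larger values fire earlier.
import numpy as np

T_MAX = 100.0  # assumed coding window

def ttfs_encode(x, t_max=T_MAX):
    """Map a normalized value x in [0, 1] to a first-spike time:
    x = 1 fires at t = 0, x = 0 fires at t = t_max."""
    x = np.clip(x, 0.0, 1.0)
    return (1.0 - x) * t_max

def ttfs_decode(t, t_max=T_MAX):
    """Inverse mapping: recover the value from the spike time."""
    return 1.0 - np.clip(t, 0.0, t_max) / t_max

x = np.array([0.9, 0.5, 0.1])
t = ttfs_encode(x)                    # earlier spikes for larger values
assert np.allclose(ttfs_decode(t), x)
```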