2021
DOI: 10.48550/arxiv.2104.10719
Preprint

A Fully Spiking Hybrid Neural Network for Energy-Efficient Object Detection

Biswadeep Chakraborty,
Xueyuan She,
Saibal Mukhopadhyay

Abstract: This paper proposes a Fully Spiking Hybrid Neural Network (FSHNN) for energy-efficient and robust object detection on resource-constrained platforms. The network architecture is a convolutional SNN built from leaky integrate-and-fire (LIF) neuron models. The model combines unsupervised spike-timing-dependent plasticity (STDP) learning with spatio-temporal backpropagation (STBP) and uses Monte Carlo Dropout to estimate prediction uncertainty. FSHNN provides better accuracy compared to DNN-based object det…
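To make the abstract's ingredients concrete, below is a minimal Python sketch (not the authors' code) of a discrete-time LIF neuron layer with dropout kept active at inference, so that repeated forward passes yield a Monte Carlo Dropout estimate of predictive uncertainty. All function names, constants, and shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_step(v, i_in, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """One Euler step of leaky integrate-and-fire dynamics."""
    v = v + dt * (-v / tau + i_in)           # leaky integration of input
    spikes = (v >= v_th).astype(np.float32)  # emit a spike at threshold
    v = np.where(spikes > 0.0, v_reset, v)   # hard reset after spiking
    return v, spikes

def mc_forward(x, w, p_drop=0.2, steps=50):
    """Rate-coded output of one LIF layer with dropout kept ON at
    inference, so each call is one Monte Carlo Dropout sample."""
    mask = (rng.random(w.shape) > p_drop) / (1.0 - p_drop)  # inverted dropout
    v = np.zeros(w.shape[0])
    rate = np.zeros(w.shape[0])
    for _ in range(steps):
        v, s = lif_step(v, (w * mask) @ x)
        rate += s
    return rate / steps  # average firing rate per output neuron

x = rng.random(8)                 # toy input vector
w = rng.normal(0.0, 0.5, (4, 8))  # toy weights: 8 inputs -> 4 LIF neurons
samples = np.stack([mc_forward(x, w) for _ in range(30)])  # 30 MC passes
print("predictive mean :", samples.mean(axis=0))
print("uncertainty(std):", samples.std(axis=0))  # spread across passes
```

The standard deviation across passes is the MC-Dropout uncertainty signal the abstract refers to; in the paper this is computed over detection outputs rather than a toy layer.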

Cited by 1 publication (1 citation statement)
References 31 publications
“…Spiking neural networks (SNNs) [1] use unsupervised bio-inspired neurons and synaptic connections, trainable with either biological learning rules such as spike-timing-dependent plasticity (STDP) [2] or supervised statistical learning algorithms such as surrogate gradient [3]. Empirical results on standard SNNs also show good performance for various tasks, including spatiotemporal data classification [4,5], sequence-to-sequence mapping [6], object detection [7,8], and universal function approximation [9,10]. An important motivation for the application of SNNs in machine learning (ML) is the sparsity in the firing (activation) of the neurons, which reduces energy dissipation during inference [11].…”
Section: Introduction
confidence: 99%
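As a companion to the quoted statement, here is a hedged sketch of the pairwise exponential STDP rule it references (the surrogate-gradient alternative it mentions is not shown). The constants a_plus, a_minus, and tau and the weight bounds are illustrative assumptions, not values from any of the cited works.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pairwise exponential STDP: potentiate when the presynaptic spike
    precedes the postsynaptic spike, depress otherwise."""
    dt = t_post - t_pre
    if dt > 0:                           # pre before post -> potentiation
        dw = a_plus * np.exp(-dt / tau)
    else:                                # post before (or with) pre -> depression
        dw = -a_minus * np.exp(dt / tau)
    return float(np.clip(w + dw, 0.0, 1.0))  # keep the weight bounded

w = 0.5
print(stdp_update(w, t_pre=10.0, t_post=15.0))  # causal pair: weight grows
print(stdp_update(w, t_pre=15.0, t_post=10.0))  # anti-causal: weight shrinks
```

Because the update depends only on relative spike times, it needs no labels, which is why the quoted text classes STDP as a biological (unsupervised) learning rule.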