2022
DOI: 10.1109/tvlsi.2022.3196839

Cerebron: A Reconfigurable Architecture for Spatiotemporal Sparse Spiking Neural Networks

Abstract: Spiking neural networks (SNNs) are promising alternatives to artificial neural networks (ANNs) since they are more realistic brain-inspired computing models. SNNs have sparse neuron firing over time, i.e., spatiotemporal sparsity; thus, they are helpful in enabling energy-efficient hardware inference. However, exploiting the spatiotemporal sparsity of SNNs in hardware leads to unpredictable and unbalanced workloads, degrading the energy efficiency. Compared to SNNs with simple fully connected …

Cited by 13 publications (6 citation statements)
References: 55 publications

“…The third type of neuromorphic hardware follows the scheme of the ANN accelerator design except for constructing dedicated hardware for synaptic operations and explores optimal dataflow for SNNs specifically [20]- [26]. These types of work require less area cost and achieve higher computing resource utilization.…”
Section: Related Work
confidence: 99%
“…where f is the system clock frequency, and M × N denotes the size of the systolic array. The peak GSOP/s calculation is the same as [20] and [24]. In FireFly, M denotes the number of columns in the systolic array, while N denotes the rows.…”
Section: B. Bridging the Gap Between Peak and Avg GSOP/s
confidence: 99%
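The peak-throughput expression referenced in this statement is not reproduced in the excerpt. A minimal Python sketch, assuming the common systolic-array form peak SOP/s = f × M × N (one synaptic operation per processing element per clock cycle), is given below; the function name and the example clock/array values are illustrative, not taken from the paper.

# Hedged sketch: peak GSOP/s of an M x N systolic array, assuming one
# synaptic operation (SOP) per processing element per clock cycle.
# The exact expression used by Cerebron/FireFly is not shown in the excerpt.
def peak_gsops(clock_hz: float, rows: int, cols: int) -> float:
    """Peak throughput in giga synaptic operations per second."""
    return clock_hz * rows * cols / 1e9

# Illustrative example: a 300 MHz clock driving a 16 x 16 array.
print(peak_gsops(300e6, 16, 16))  # -> 76.8 GSOP/s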
“…The spikes propagate through the network until reaching the output. We adopt the integrate-and-fire (IF) neuron model in the Spiking ORNN [47]. Suppose the IF neurons in layer l receive binary input spikes s^l(t) at timestep t, the temporary membrane potential of neurons is updated according to the following equation:…”
Section: Architecture of Spiking-ORNN
confidence: 99%
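The membrane-potential update equation is truncated in the excerpt. Below is a minimal Python sketch of a standard integrate-and-fire layer step (accumulate weighted binary input spikes, fire when the potential crosses a threshold, reset); the threshold value, subtractive reset rule, and tensor shapes are assumptions and may differ from the Spiking-ORNN formulation.

import numpy as np

# Hedged sketch of a standard integrate-and-fire (IF) layer update; the exact
# Spiking-ORNN equation is not shown in the excerpt. The threshold and the
# subtractive reset rule are illustrative assumptions.
def if_layer_step(v, weights, spikes_in, threshold=1.0):
    """One timestep: integrate weighted binary spikes, fire, then reset."""
    v = v + weights @ spikes_in                     # temporary membrane potential
    spikes_out = (v >= threshold).astype(v.dtype)   # binary output spikes
    v = v - spikes_out * threshold                  # subtractive reset after firing
    return v, spikes_out

# Illustrative usage: 4 IF neurons receiving 3 binary input spikes.
rng = np.random.default_rng(0)
v = np.zeros(4)
w = rng.normal(size=(4, 3))
s_in = rng.integers(0, 2, size=3).astype(float)
v, s_out = if_layer_step(v, w, s_in)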
“…With heavy investment from industry and government, e.g. IBM, Intel, and DARPA, SNNs are expected to be part of the future artificial intelligence (AI) portfolio [46,47].…”
Section: Introduction
confidence: 99%