2022
DOI: 10.48550/arxiv.2204.05422
Preprint

SATA: Sparsity-Aware Training Accelerator for Spiking Neural Networks

Cited by 4 publications (13 citation statements) | References 0 publications
“…[Table: taxonomy of BIC chips]

Chip/Design | Workloads | Architecture | Implementation
… Original | SNNs | Near-Memory Computing | Analog-Digital-Mixed, Large-Scale
BrainScaleS [87] | SNNs, Learning | Near-Memory Computing | Analog-Digital-Mixed, Large-Scale
SpiNNaker [81] | SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
TrueNorth [14] | SNNs | Near-Memory Computing | Digital, Large-Scale
Darwin [83] | SNNs | Near-Memory Computing | Digital, Small-Scale
ROLLS [88] | SNNs, Learning | Near-Memory Computing | Analog-Digital-Mixed, Small-Scale
DYNAPs [94] | SNNs | Near-Memory Computing | Analog-Digital-Mixed, Small-Scale
Loihi [41] | SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
Tianjic [85], [86] | ANNs & SNNs | Near-Memory Computing | Digital, Large-Scale
ODIN [89] | SNNs, Learning | Near-Memory Computing | Digital, Small-Scale
MorphIC [90] | SNNs, Learning | Near-Memory Computing | Digital, Small-Scale
DYNAP-CNN/DYNAP-SE [44] | SNNs | Near-Memory Computing | Digital, Small-Scale
FlexLearn [99] | SNNs, Learning | ANN Accelerator Variants | Digital, Large-Scale
SpinalFlow [100] | SNNs | ANN Accelerator Variants | Digital, Small-Scale
H2Learn [101] | SNNs, Learning | ANN Accelerator Variants | Digital, Large-Scale
SATA [102] | SNNs, Learning | ANN Accelerator Variants | Digital, Small-Scale
BrainScaleS 2 [82] | ANNs & SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
SpiNNaker 2 [95] | ANNs & SNNs, Learning | Near-Memory Computing | Digital, Large-Scale
Y. Kuang et al. [282] | ANNs & SNNs | Near-Memory Computing | Digital, Large-Scale
SRAM/DRAM/Flash-based | ANNs, SNNs | In-Memory Computing | Digital, Small-Scale
Memristor-based | ANNs, SNNs | In-Memory Computing | Analog-Digital-Mixed, Small-Scale

… neuromorphic workloads. The learning of SNNs on GPUs is inefficient and hard to optimize [91].…”
Section: Chip Family (mentioning; confidence: 99%)
“…Recently, backpropagation through time (BPTT) has been applied to SNNs and has demonstrated much higher accuracy than bio-plausible synaptic plasticity rules [67], [75]. Several works, such as H2Learn [101] and SATA [102], design dedicated architectures for BPTT-based learning of SNNs. In the future, the incorporation of learning rules will be increasingly critical for BIC chips to explore large and complex neuromorphic models.…”
Section: Chip Family (mentioning; confidence: 99%)
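The BPTT-based SNN training that H2Learn and SATA accelerate can be illustrated with a short sketch: the membrane potential is unrolled over discrete time steps, the non-differentiable spike function is replaced by a surrogate gradient in the backward pass, and the loss is backpropagated through every step. The code below is a minimal illustration assuming PyTorch; the layer sizes, fast-sigmoid surrogate, slope value, and soft-reset rule are illustrative assumptions, not details of the cited accelerators.

```python
# Minimal sketch of BPTT training for one spiking layer with a surrogate
# gradient (assumed setup; not the architecture of H2Learn or SATA).
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; a smooth surrogate
    (fast-sigmoid derivative) in the backward pass so BPTT can flow."""
    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        slope = 10.0  # surrogate sharpness (assumed hyperparameter)
        return grad_output / (1.0 + slope * v.abs()) ** 2

class LIFLayer(nn.Module):
    """Leaky integrate-and-fire layer unrolled over T time steps."""
    def __init__(self, in_features, out_features, beta=0.9, thresh=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.beta, self.thresh = beta, thresh

    def forward(self, x):  # x: (T, batch, in_features) binary spike trains
        T, batch, _ = x.shape
        v = torch.zeros(batch, self.fc.out_features, device=x.device)
        spikes = []
        for t in range(T):  # explicit unrolling: BPTT backprops through every step
            v = self.beta * v + self.fc(x[t])       # leaky integration
            s = SurrogateSpike.apply(v - self.thresh)
            v = v - s * self.thresh                 # soft reset after a spike
            spikes.append(s)
        return torch.stack(spikes)                  # (T, batch, out_features)

# Usage: rate-coded random input, spike-count readout, one BPTT step.
T, batch, n_in, n_out = 8, 4, 16, 10
layer = LIFLayer(n_in, n_out)
x = (torch.rand(T, batch, n_in) < 0.3).float()      # Bernoulli spike trains
target = torch.randint(0, n_out, (batch,))
logits = layer(x).sum(dim=0)                        # spike counts as class scores
loss = nn.functional.cross_entropy(logits, target)
loss.backward()                                     # gradients via BPTT + surrogate
```

The explicit unrolling over T time steps is what makes BPTT memory- and compute-intensive on GPUs, which is the inefficiency that dedicated training accelerators such as H2Learn and SATA target.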
“…From the functionality perspective, existing BIC chips can be classified into three categories: those supporting only SNNs, those supporting both SNNs and ANNs [85], [86], and those additionally supporting learning rules [87]-[92]. From the architecture perspective, BIC chips belong to one of the following categories: near-memory-computing architectures [14], [41], [44], [81]-[83], [85]-[90], [93]-[95], in-memory-computing architectures [96]-[98], and ANN accelerator variants [99]-[102]. From the implementation perspective, the trade-off is more complicated because many factors, including application scenarios, PPA (performance, power, and area), and programmability, must be considered together [103].…”
Section: Introduction, Motivation and Overview (mentioning; confidence: 99%)