2019 IEEE 13th International Conference on ASIC (ASICON)
DOI: 10.1109/asicon47005.2019.8983560

An Efficient Accelerator for Sparse Convolutional Neural Networks

Cited by 5 publications (7 citation statements). References 6 publications.
“…However, these deep networks [14,21,25,33,36] can have multiple hidden layers, millions of parameters, billions of operations and require tremendous storage and intense computation resources, making it difficult to realize energy-efficient and high-performance solutions. To address this issue, several model compression techniques [10,17,18,20,26,38], efficient dataflow techniques [32,35], and dataflow accelerators [1,3,5,7,8,15,16,19,28,31,39,40,41,42,43] have been proposed and widely investigated in recent years.…”
Section: Introduction (mentioning, confidence: 99%)
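The compression techniques cited in this statement are not detailed in the report; as a rough, self-contained illustration of the most common one, the NumPy sketch below performs magnitude-based unstructured pruning, zeroing the smallest weights until a target sparsity is reached. The layer size and sparsity target are assumptions made up for the example, not values from the paper.

    import numpy as np

    def magnitude_prune(weights, sparsity=0.9):
        # Zero out the smallest-magnitude entries so that `sparsity`
        # fraction of the tensor becomes zero (unstructured pruning).
        flat = np.abs(weights).ravel()
        k = int(sparsity * flat.size)
        if k == 0:
            return weights.copy()
        threshold = np.partition(flat, k - 1)[k - 1]
        return np.where(np.abs(weights) <= threshold, 0.0, weights)

    # Hypothetical 64x64 fully connected layer pruned to ~90% sparsity.
    w = np.random.randn(64, 64).astype(np.float32)
    w_sparse = magnitude_prune(w, sparsity=0.9)
    print(f"sparsity: {np.mean(w_sparse == 0):.2%}")

A pruned tensor like w_sparse only pays off if the zeros are then skipped in storage and compute, which is what the sparse accelerators discussed in the next citation statement are built to do.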
“…In particular, exploiting high sparsity in CNNs has emerged as a promising approach for achieving energy-efficient and high-performance CNN solutions [1,3,5,7,8,15,16,19,28,31,39,40,41,42,43]. Several sparse CNN (s-CNN) accelerators have recently been proposed to exploit the structured [28,43] as well as the unstructured sparsity [1,3,5,7,8,15,16,19,31,39,40,41,42] in both CNN model parameters and activations.…”
Section: Introduction (mentioning, confidence: 99%)
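To make concrete how such accelerators turn sparsity into savings, here is a minimal software analogue, assumed for illustration rather than taken from the paper: weights are stored in compressed sparse row (CSR) form, so only nonzeros consume storage, and the matrix-vector product performs MACs only on stored entries. Hardware s-CNN designs implement the same zero-skipping idea with dedicated index-matching logic.

    import numpy as np

    def to_csr(w):
        # Compress a sparse matrix into CSR arrays: nonzero values,
        # their column indices, and per-row pointers into those arrays.
        values, cols, rowptr = [], [], [0]
        for row in w:
            nz = np.nonzero(row)[0]
            values.extend(row[nz])
            cols.extend(nz)
            rowptr.append(len(values))
        return (np.array(values, dtype=w.dtype),
                np.array(cols, dtype=np.int64),
                np.array(rowptr, dtype=np.int64))

    def csr_matvec(values, cols, rowptr, x):
        # Sparse matrix-vector product: MACs happen only on stored
        # nonzeros, so zero weights are skipped entirely.
        y = np.zeros(len(rowptr) - 1, dtype=values.dtype)
        for i in range(len(y)):
            lo, hi = rowptr[i], rowptr[i + 1]
            y[i] = np.dot(values[lo:hi], x[cols[lo:hi]])
        return y

    w = np.random.randn(8, 16).astype(np.float32)
    w[np.abs(w) < 1.0] = 0.0          # ad-hoc unstructured sparsity
    x = np.random.randn(16).astype(np.float32)
    vals, cols, rowptr = to_csr(w)
    assert np.allclose(csr_matvec(vals, cols, rowptr, x), w @ x, atol=1e-5)

Structured sparsity [28,43] keeps the nonzeros in regular blocks or whole channels, which simplifies this indexing at some cost in pruning flexibility; unstructured sparsity [1,3,5,7,8,15,16,19,31,39,40,41,42] maximizes flexibility but needs index arrays like the ones above.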
“…A preliminary conference version of our work was reported in [23]. Here, we enhance that work with a new load-balance aware pruning method for better overall sparse computation efficiency, as well as a kernel transformation method for arbitrary CONV kernels.…”
Section: Introduction (mentioning, confidence: 99%)
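The report does not reproduce the authors' load-balance aware pruning method, so the NumPy sketch below only illustrates the general idea under assumed parameters: weight rows are split into groups (one hypothetical group per processing element) and each group is pruned to the same sparsity, so no PE ends up with more nonzeros, and hence more work, than the others.

    import numpy as np

    def load_balanced_prune(w, sparsity=0.75, n_groups=4):
        # Prune every row group to an identical sparsity so that PEs
        # assigned one group each receive equal nonzero workloads.
        out = w.copy()
        for rows in np.array_split(np.arange(w.shape[0]), n_groups):
            block = out[rows]
            k = int(sparsity * block.size)
            if k > 0:
                thresh = np.partition(np.abs(block).ravel(), k - 1)[k - 1]
                out[rows] = np.where(np.abs(block) <= thresh, 0.0, block)
        return out

    w = np.random.randn(64, 64).astype(np.float32)
    w_lb = load_balanced_prune(w)
    # Each group now holds (nearly) the same number of nonzeros.
    for rows in np.array_split(np.arange(64), 4):
        print(int(np.count_nonzero(w_lb[rows])))

Global magnitude pruning, by contrast, can leave one group much denser than another, and the PE that owns the dense group then becomes the straggler that bounds overall throughput.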