RTMobile: Beyond Real-Time Mobile Acceleration of RNNs for Speech Recognition

2020 · Preprint
DOI: 10.48550/arxiv.2002.11474

Cited by 6 publications (7 citation statements)
References 28 publications
Citation types: 0 supporting, 7 mentioning, 0 contrasting

“…This leads to irregular weight distribution and inevitably introduces extra indices to store the locations of pruned weights. Eventually, this drawback limits performance acceleration [8].…”
Section: Background 2.1 DNN Model Pruning
Citation type: mentioning
confidence: 99%
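
The overhead this statement describes can be made concrete in a few lines. Below is a minimal sketch (my illustration, not code from RTMobile or the citing paper; the COO-style value/row/column layout is just one common sparse format): after unstructured magnitude pruning, every surviving weight needs explicit indices, and those index arrays are the "extra indices" that cost memory and force irregular access.

```python
import numpy as np

def prune_unstructured(weights: np.ndarray, sparsity: float):
    """Zero out the smallest-magnitude weights, keeping a (1 - sparsity) fraction."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float32)
pruned, mask = prune_unstructured(w, sparsity=0.9)

# COO-style sparse storage: each kept weight drags along explicit
# (row, col) indices, since the nonzeros sit at irregular positions.
rows, cols = np.nonzero(mask)
values = pruned[rows, cols]
index_bytes = rows.astype(np.int32).nbytes + cols.astype(np.int32).nbytes
print(f"dense: {w.nbytes} B, values: {values.nbytes} B, indices: {index_bytes} B")
# At 90% sparsity the index arrays already outweigh the values themselves,
# and the gather/scatter access pattern they imply is what limits
# acceleration on parallel hardware.
```
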
“…Pruning Algorithms. There are three main issues with the existing pattern-based pruning algorithms [8, 35]: (1) the existing algorithms select a pattern for each kernel by estimating weight importance based on magnitude. However, this estimation assumes a well-trained model, whose weights will no longer change dramatically; that assumption does not hold when training from scratch.…”
Section: Issues of Existing Pattern
Citation type: mentioning
confidence: 99%
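
A minimal sketch of the magnitude-based pattern selection this statement critiques (my illustration; the 4-entry pattern set below is made up for the example, not the actual set used in [8, 35]): for each 3x3 kernel, pick the fixed mask that preserves the most weight magnitude.

```python
import numpy as np

# Illustrative 4-entry patterns (hypothetical, not the published pattern set).
PATTERNS = np.array([
    [[0, 1, 0], [1, 1, 1], [0, 0, 0]],
    [[0, 0, 0], [1, 1, 1], [0, 1, 0]],
    [[1, 1, 0], [0, 1, 0], [0, 1, 0]],
    [[0, 1, 0], [0, 1, 0], [0, 1, 1]],
], dtype=np.float32)

def select_pattern(kernel: np.ndarray) -> int:
    """Return the index of the pattern that retains the most L1 magnitude."""
    scores = [np.abs(kernel * p).sum() for p in PATTERNS]
    return int(np.argmax(scores))

rng = np.random.default_rng(1)
kernels = rng.standard_normal((8, 3, 3)).astype(np.float32)
print([select_pattern(k) for k in kernels])
# The critique above: these magnitude scores are only meaningful for a
# well-trained model; early in training from scratch the magnitudes (and
# therefore the selected patterns) are still unstable.
```
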
“…Different from prior work on coarse-grained pruning and NAS, which finds a smaller yet regular DNN structure, recent work [16], [48], [58] proposes to prune weights in a more fine-grained manner, e.g., by assigning potentially different patterns to individual kernels. Higher accuracy can be achieved as a result of this intra-kernel flexibility, while high hardware parallelism (and hence mobile inference acceleration) can be achieved with the assistance of compiler-level code generation techniques [58].…”
Section: Introduction
Citation type: mentioning
confidence: 99%
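
To illustrate why per-kernel patterns can still yield high parallelism, here is a sketch of the grouping idea behind compiler-level code generation (my illustration only, not the actual code generator of [58]): kernels that share a pattern share one set of nonzero offsets, so each group can execute a single regular, easily vectorized code path.

```python
from collections import defaultdict

def group_by_pattern(pattern_ids):
    """Map each pattern id to the list of kernels assigned to it."""
    groups = defaultdict(list)
    for kernel_idx, pid in enumerate(pattern_ids):
        groups[pid].append(kernel_idx)
    return groups

# Hypothetical per-kernel pattern assignments produced by a pruning pass.
pattern_ids = [0, 2, 0, 1, 2, 0, 1, 3]
for pid, kernel_idxs in sorted(group_by_pattern(pattern_ids).items()):
    # One generated code path per pattern: identical nonzero offsets for
    # every kernel in the group, so no per-weight index lookups are needed.
    print(f"pattern {pid}: kernels {kernel_idxs}")
```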