2022
DOI: 10.48550/arxiv.2207.11108
Preprint

Inference skipping for more efficient real-time speech enhancement with parallel RNNs

Xiaohuai Le, Tong Lei, Kai Chen, et al.

Abstract: Deep neural network (DNN) based speech enhancement models have attracted extensive attention due to their promising performance. However, it is difficult to deploy a powerful DNN in real-time applications because of its high computational cost. Typical compression methods such as pruning and quantization do not make good use of the data characteristics. In this paper, we introduce the Skip-RNN strategy into speech enhancement models with parallel RNNs. The states of the RNNs update intermittently without inter…
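
The intermittent state update described in the abstract can be illustrated with a minimal sketch. The sketch below assumes a standard Skip-RNN style update gate (in the spirit of Campos et al.) wrapped around a single GRU cell; it is not the authors' exact parallel-RNN model, and the names SkipGRUCell and to_delta are hypothetical. A cumulative update probability decides at each frame whether the recurrent state is recomputed or simply copied forward, which is what allows computation to be skipped on uninformative frames.

    # Minimal sketch of a Skip-RNN style cell (assumed formulation; not the
    # paper's exact parallel-RNN implementation).
    import torch
    import torch.nn as nn

    class SkipGRUCell(nn.Module):
        """GRU cell whose state is only updated when a cumulative update
        probability crosses 0.5; otherwise the previous state is copied."""

        def __init__(self, input_size: int, hidden_size: int):
            super().__init__()
            self.cell = nn.GRUCell(input_size, hidden_size)
            self.to_delta = nn.Linear(hidden_size, 1)  # predicts the next update increment

        def forward(self, x_t, h_prev, u_tilde):
            # Binarize the accumulated update probability
            # (straight-through estimator omitted for brevity): 1 -> update, 0 -> skip.
            u_t = (u_tilde >= 0.5).float()
            # In a real-time deployment the GRUCell call would be skipped entirely
            # when u_t == 0; it is computed unconditionally here to keep the sketch short.
            h_cand = self.cell(x_t, h_prev)
            h_t = u_t * h_cand + (1.0 - u_t) * h_prev
            delta = torch.sigmoid(self.to_delta(h_t))
            # When skipping, accumulate the increment so an update eventually fires.
            u_next = u_t * delta + (1.0 - u_t) * (u_tilde + torch.minimum(delta, 1.0 - u_tilde))
            return h_t, u_next

    # Usage: process a sequence of feature frames one step at a time.
    cell = SkipGRUCell(input_size=64, hidden_size=128)
    x = torch.randn(1, 10, 64)   # (batch, frames, features)
    h = torch.zeros(1, 128)
    u = torch.ones(1, 1)         # start with a forced state update
    for t in range(x.shape[1]):
        h, u = cell(x[:, t], h, u)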


Cited by 0 publications
References 38 publications
