Interspeech 2021
DOI: 10.21437/interspeech.2021-571

Momentum Pseudo-Labeling for Semi-Supervised Speech Recognition

Abstract: Pseudo-labeling (PL) has been shown to be effective in semi-supervised automatic speech recognition (ASR), where a base model is self-trained with pseudo-labels generated from unlabeled data. While PL can be further improved by iteratively updating pseudo-labels as the model evolves, most of the previous approaches involve inefficient retraining of the model or intricate control of the label update. We present momentum pseudo-labeling (MPL), a simple yet effective strategy for semi-supervised ASR. MPL consists o…

Cited by 23 publications (11 citation statements) · References 36 publications (71 reference statements)

Citation statements (ordered by relevance):
“…(Eq. (22)) used to derive α in the momentum update (Eq. (20)). Note that the figures are reproduced from our previous paper [47]. We observed a similar trend among the curves in different semi-supervised settings (Figs. …”
Section: B. Effectiveness of w for Tuning the Momentum Update (supporting)
confidence: 81%
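
For context, the momentum update this excerpt refers to is an exponential-moving-average (EMA) update of the offline (pseudo-label-generating) model toward the online model, with the momentum coefficient α derived from a tuning weight w (Eqs. (20)–(22) of the cited paper). The PyTorch sketch below only illustrates the EMA step itself; it treats α as a given scalar and does not reproduce the paper's w-to-α derivation, so it is an assumed simplification rather than the authors' implementation.

import torch

@torch.no_grad()
def momentum_update(offline_model, online_model, alpha):
    # EMA-style momentum update of the offline (pseudo-label) model:
    #   offline <- alpha * offline + (1 - alpha) * online
    # alpha is assumed to be precomputed (the cited paper derives it from w).
    for p_off, p_on in zip(offline_model.parameters(), online_model.parameters()):
        p_off.mul_(alpha).add_(p_on, alpha=1.0 - alpha)

In an MPL-style training loop this would typically be called once after each optimization step on the online model, so the offline model tracks a slowly moving average of the online weights.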
“…• We perform thorough analyses to confirm the effectiveness of MPL and propose several methods to further improve ASR performance. This paper summarizes our previous studies on MPL [47], [48] with the following extensions: we provide more detailed explanations of relationship to prior works (Section II) and precise formulations of end-to-end ASR and pseudo-labeling (Section III); we present a consistent description of [47] and [48], with more specific implementations (Section IV); we conduct experiments on a variety of semi-supervised scenarios, including additional experiments on smaller and larger amounts of labeled data (Section V); and we further demonstrate the effectiveness of MPL through more detailed experimental results and discussions (Section VI).…”
Section: Introduction (mentioning)
confidence: 79%
“…We adopt continuous PL (shown in Fig. 2c) [23,24] to compute the L_ASR in both stage 1 and stage 2. Note that the continuous PL approach could also be used in other FL approaches like FedNorm and FedExtract.…”
Section: Unsupervised Training with Continuous PL (mentioning)
confidence: 99%
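
The continuous PL referred to here regenerates pseudo-labels on the fly during training (e.g., with a momentum or otherwise frozen offline model), so the targets keep improving as training proceeds instead of being fixed by a one-off offline decoding pass. The sketch below is an assumed, simplified illustration for a CTC-based encoder that outputs per-frame logits of shape (batch, time, vocab); the function names and batch layout are hypothetical and not taken from the cited papers.

import torch
import torch.nn.functional as F

def greedy_ctc_pseudo_labels(log_probs, blank_id=0):
    # Greedy CTC decoding: per-frame argmax, collapse repeats, drop blanks.
    # log_probs: (T, B, V); returns flat targets and per-utterance lengths.
    ids = log_probs.argmax(dim=-1)  # (T, B)
    targets, lengths = [], []
    for b in range(ids.size(1)):
        seq, prev = [], None
        for t in ids[:, b].tolist():
            if t != prev and t != blank_id:
                seq.append(t)
            prev = t
        targets.extend(seq)
        lengths.append(len(seq))
    return (torch.tensor(targets, dtype=torch.long),
            torch.tensor(lengths, dtype=torch.long))

def continuous_pl_step(online_model, offline_model, feats, feat_lengths, optimizer, blank_id=0):
    # One continuous-PL step: pseudo-labels come from the frozen offline model,
    # and the online model is trained on them with a CTC loss.
    with torch.no_grad():
        off_log_probs = offline_model(feats).log_softmax(-1).transpose(0, 1)  # (T, B, V)
        targets, target_lengths = greedy_ctc_pseudo_labels(off_log_probs, blank_id)
    optimizer.zero_grad()
    on_log_probs = online_model(feats).log_softmax(-1).transpose(0, 1)  # (T, B, V)
    loss = F.ctc_loss(on_log_probs, targets, feat_lengths, target_lengths,
                      blank=blank_id, zero_infinity=True)
    loss.backward()
    optimizer.step()
    return loss.item()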
“…Then, some training burdens are moved to the server, thus reducing computation on clients. Additionally, DecoupleFL adopts pseudo-labeling (PL) approaches [23,24] for unsupervised learning, avoiding the unrealistic labeled data assumption. Moreover, one potential concern is communicating features might lead to privacy leakage.…”
Section: Introduction (mentioning)
confidence: 99%