2022
DOI: 10.48550/arxiv.2203.03582
Preprint
Improving CTC-based speech recognition via knowledge transferring from pre-trained language models

Cited by 1 publication (3 citation statements). References 0 publications.
“…Also, neural-network-based FA strategies such as NeuFA [?] and CIF [6,4] have been proposed. Even Connectionist Temporal Classification (CTC) can generally be regarded as an FA strategy.…”
Section: Forced Alignment Strategies
confidence: 99%
“…To reduce the information loss of BERT, which carries richer information than the wave models, the parameters of BERT are frozen. Inspired by the work [4], we use the serial CIF [6] mechanism to achieve monotonic alignment between the speech and text modalities.…”
Section: Alignment Between Two Modalities
confidence: 99%
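The CIF (continuous integrate-and-fire) mechanism mentioned in this excerpt accumulates per-frame weights until they cross a threshold, at which point the weighted frame features collected so far are pooled into one token-level vector, giving the monotonic speech-to-text alignment. The threshold of 1.0 follows the usual CIF convention; the toy features and weights below are illustrative, not from the cited works.

```python
# Minimal sketch of CIF: integrate per-frame weights, "fire" a token vector
# each time the accumulator reaches the threshold, splitting the firing
# frame's weight between the finished token and the next one.
def cif(frames, alphas, threshold=1.0):
    """frames: T x D frame feature vectors; alphas: T scalar weights.
    Returns a list of token-level vectors (one per threshold crossing)."""
    dim = len(frames[0])
    tokens, acc, pooled = [], 0.0, [0.0] * dim
    for feat, a in zip(frames, alphas):
        if acc + a < threshold:
            # keep integrating this frame into the current token
            acc += a
            pooled = [p + a * f for p, f in zip(pooled, feat)]
        else:
            # part of the weight completes the current token ...
            used = threshold - acc
            tokens.append([p + used * f for p, f in zip(pooled, feat)])
            # ... and the remainder starts the next one
            acc = a - used
            pooled = [acc * f for f in feat]
    return tokens
```

With uniform weights of 0.5 per frame, every two frames fire one token, so four frames produce two token vectors, each a weighted sum of its two frames.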