2020
DOI: 10.48550/arxiv.2008.06208
Preprint

Adaptable Multi-Domain Language Model for Transformer ASR

Cited by 2 publications (2 citation statements)
References 0 publications
“…In addition, one-hot encoding of ground-truth domain information is also used when it is available during inference [2], [3], [7]. As for model-based approaches, extra components, such as a linear hidden network (LHN) [11], learning hidden unit contributions (LHUC) [12], [13], or adapters [14], are added to base models to learn domain-specific information [7], [14]. Unfortunately, these approaches cannot be directly applied to our case, since new songs may be released at any time.…”
Section: Prior Work
Confidence: 99%
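Of the model-based approaches named in the citation statement above, residual adapters [14] are small bottleneck layers inserted into a frozen base model and trained per domain. The following is a minimal sketch of such an adapter in PyTorch; the module name, dimensions, and placement are illustrative assumptions, not the implementation from the cited works.

```python
# Minimal residual-adapter sketch (illustrative; not from the cited papers).
# A small bottleneck MLP is attached to a frozen base model; only the adapter
# weights are trained per domain, and the residual connection preserves the
# base model's behavior when the adapter contribution is near zero.
import torch
import torch.nn as nn

class DomainAdapter(nn.Module):
    def __init__(self, d_model: int = 512, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)  # down-projection
        self.up = nn.Linear(bottleneck, d_model)    # up-projection
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, d_model) hidden states from a Transformer layer.
        return x + self.up(self.act(self.down(x)))  # residual add

# Hypothetical usage: one adapter per domain, base model kept frozen.
adapter = DomainAdapter(d_model=512, bottleneck=64)
hidden = torch.randn(8, 100, 512)  # fake hidden states (batch, time, d_model)
adapted = adapter(hidden)          # same shape, domain-adapted
```

Because only the bottleneck weights are trained, each new domain adds a small number of parameters; the quoted objection is that this still requires retraining whenever a new domain (here, a new song) appears.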