Interspeech 2019
DOI: 10.21437/interspeech.2019-2158
Investigating Adaptation and Transfer Learning for End-to-End Spoken Language Understanding from Speech

Abstract: This work investigates speaker adaptation and transfer learning for spoken language understanding (SLU). We focus on the direct extraction of semantic tags from the audio signal using an end-to-end neural network approach. We demonstrate that the learning performance of the target predictive function for the semantic slot filling task can be substantially improved by speaker adaptation and by various knowledge transfer approaches. First, we explore speaker adaptive training (SAT) for end-to-end SLU models and …
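The abstract's first contribution is speaker adaptive training (SAT) for end-to-end SLU. One common realization of SAT, shown here purely as a hedged illustration (the paper's exact architecture may differ), is to concatenate a fixed-dimensional speaker vector (e.g. an i-vector) to each acoustic frame before the encoder. All module names and dimensions below are hypothetical.

```python
# Illustrative SAT-style encoder: a per-speaker embedding is tiled over
# time and concatenated to the acoustic features. Hypothetical names and
# sizes; a sketch of the general technique, not the paper's model.
import torch
import torch.nn as nn

class SATEncoder(nn.Module):
    def __init__(self, n_mels=80, spk_dim=100, hidden=512):
        super().__init__()
        self.rnn = nn.LSTM(n_mels + spk_dim, hidden, num_layers=3,
                           bidirectional=True, batch_first=True)

    def forward(self, feats, spk_vec):
        # feats: (batch, time, n_mels); spk_vec: (batch, spk_dim)
        spk = spk_vec.unsqueeze(1).expand(-1, feats.size(1), -1)
        out, _ = self.rnn(torch.cat([feats, spk], dim=-1))
        return out                        # (batch, time, 2 * hidden)

feats = torch.randn(4, 200, 80)           # dummy filterbank features
spk_vec = torch.randn(4, 100)             # dummy speaker vectors
out = SATEncoder()(feats, spk_vec)        # shape: (4, 200, 1024)
```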

Cited by 31 publications (28 citation statements)
References 24 publications
“…phonemes, characters, wordpieces, words) with NLU-level units (e.g. intents, slots) [9,10]. Two-step training approaches have also been proposed, where the network is pretrained on large datasets using ASR-level recognition units and subsequently fine-tuned on the target dataset using NLU-level recognition units [7,11].…”
Section: Introduction (mentioning)
Confidence: 99%
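The two-step recipe quoted above (pretrain with ASR-level units, then fine-tune with NLU-level units) amounts to swapping the output layer over a shared encoder. A minimal, hypothetical PyTorch sketch follows; the module names, sizes, and omitted training loops are assumptions, not taken from [7,11].

```python
# Two-step training sketch: pretrain on ASR-level units, then replace the
# output layer with NLU-level units and fine-tune. Hypothetical sizes.
import torch
import torch.nn as nn

encoder = nn.LSTM(80, 512, num_layers=4, bidirectional=True,
                  batch_first=True)      # shared acoustic encoder

# Step 1: ASR-level head (e.g. ~30 character targets for a CTC objective).
asr_head = nn.Linear(1024, 30)
# ... pretrain (encoder, asr_head) on a large ASR corpus ...

# Step 2: discard the ASR head, attach an NLU-level head (intents/slots),
# and fine-tune everything on the smaller target SLU dataset.
slu_head = nn.Linear(1024, 100)           # hypothetical slot-tag inventory
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(slu_head.parameters()),
    lr=1e-4)                              # small LR is typical for fine-tuning
```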
“…The best result (line #4) is obtained for supervised-all h-vectors and corresponds to a 12.5% relative CER reduction and an 11.9% CVER reduction compared with the baseline model. It was shown in [29] that transfer learning can significantly improve the performance of end-to-end SLU models. In this work, we are also interested in exploring the proposed approach for more accurate models trained using the transfer learning paradigm.…”
Section: Results (mentioning)
Confidence: 99%
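For context on the metrics in this excerpt: in MEDIA-style slot filling work, CER and CVER usually denote the concept error rate and concept-value error rate, i.e. WER-like edit distances computed over concept sequences (CER) or over (concept, value) pairs (CVER). The helper below is a small illustration under that assumed reading; it is not code from [29].

```python
# WER-style error rate over arbitrary token sequences; with concept
# sequences it approximates CER, with (concept, value) pairs, CVER.
# Assumed MEDIA-style definitions, not taken from the cited paper.
def error_rate(ref, hyp):
    m, n = len(ref), len(hyp)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                       # deletions only
    for j in range(n + 1):
        d[0][j] = j                       # insertions only
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, sub)
    return d[m][n] / max(m, 1)

# CER-style: one deleted concept out of three -> 1/3.
print(error_rate(["command", "localisation", "date"], ["command", "date"]))
# CVER-style: concept matches but its value differs -> counted as an error.
print(error_rate([("date", "mars")], [("date", "avril")]))  # 1.0
```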
“…In this work, we are also interested in exploring the proposed approach for more accurate models trained using the transfer learning paradigm. For this purpose, we trained two models using transfer learning from the ASR task, as proposed in [29] and described in Section 4.3. Results for these models are presented in Table 3.…”
Section: Results (mentioning)
Confidence: 99%