Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1129
Zero-shot Cross-lingual Dialogue Systems with Transferable Latent Variables

Abstract: Despite the surging demand for multilingual task-oriented dialogue systems (e.g., Alexa, Google Home), there has been less research done in multilingual or cross-lingual scenarios. Hence, we propose a zero-shot adaptation of task-oriented dialogue systems to low-resource languages. To tackle this challenge, we first use a set of very few parallel word pairs to refine the aligned cross-lingual word-level representations. We then employ a latent variable model to cope with the variance of similar sentences across di…
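The abstract's first step — refining aligned cross-lingual word-level representations from a handful of parallel word pairs — is commonly implemented by learning an orthogonal mapping between the two embedding spaces (orthogonal Procrustes). The sketch below illustrates that general idea with NumPy; the array names and toy dimensions are illustrative, and the paper's actual refinement procedure may differ from this standard baseline.

```python
import numpy as np

def refine_alignment(src_vecs, tgt_vecs):
    # Orthogonal Procrustes: find an orthogonal W minimizing
    # sum_i || W @ src_i - tgt_i ||^2 over the seed word pairs.
    # src_vecs, tgt_vecs: (n_pairs, dim) embeddings of the parallel words.
    u, _, vt = np.linalg.svd(tgt_vecs.T @ src_vecs)
    return u @ vt  # (dim, dim) orthogonal mapping

# Toy example: 5 hypothetical seed pairs in a 4-dim embedding space,
# where the "target" space is an exact rotation of the "source" space.
rng = np.random.default_rng(0)
src = rng.normal(size=(5, 4))
true_rot, _ = np.linalg.qr(rng.normal(size=(4, 4)))
tgt = src @ true_rot.T

W = refine_alignment(src, tgt)
print(np.allclose(src @ W.T, tgt))  # True: the rotation is recovered
```

Because the learned mapping is constrained to be orthogonal, it refines the alignment without distorting distances within either embedding space, which is why a very small seed dictionary can suffice.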

Cited by 60 publications (55 citation statements); References 17 publications.
“…This work has also empirically validated that there is still ample room for improvement in the intent detection task, especially in low-data regimes. Therefore, similar to recent work (Upadhyay et al., 2018; Khalil et al., 2019; Liu et al., 2019c), we will also investigate how to transfer intent detectors to low-resource target languages in few-shot and zero-shot scenarios. We also plan to extend the models to handle out-of-scope prediction (Larson et al., 2019).…”
Section: Discussion (mentioning)
confidence: 94%
“…Recently, several task-oriented dialogue models have been introduced to tackle the resource scarcity challenges in target domains (Bapna et al., 2017; Wu et al., 2019a) and target languages (Mrkšić et al., 2017; Schuster et al., 2019; Chen et al., 2018; Liu et al., 2019b), and large pre-trained language models have been shown to possess the capability to quickly adapt to task-oriented dialogue tasks by using only a few data samples (Peng et al., 2020b; Madotto et al., 2020b).…”
Section: Related Work (mentioning)
confidence: 99%
“…Sachan et al. (2018) and Jia, Liang, and Zhang (2019) injected target domain knowledge into language models for fast adaptation, and Jia and Zhang (2020) presented a multi-cell compositional network for NER domain adaptation. Additionally, fast adaptation algorithms have been applied to low-resource languages (Lample and Conneau 2019; Liu et al. 2019, 2020a; Wilie et al. 2020), accents (Winata et al. 2020), and machine translation (Artetxe et al. 2018; Lample et al. 2018).…”
Section: Related Work (mentioning)
confidence: 99%