Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.243

To What Degree Can Language Borders Be Blurred In BERT-based Multilingual Spoken Language Understanding?

Abstract: This paper addresses the question as to what degree a BERT-based multilingual Spoken Language Understanding (SLU) model can transfer knowledge across languages. Through experiments we will show that, although it works substantially well even on distant language groups, there is still a gap to the ideal multilingual performance. In addition, we propose a novel BERT-based adversarial model architecture to learn language-shared and language-specific representations for multilingual SLU. Our experimental results p…
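The abstract describes an adversarial architecture that separates language-shared from language-specific representations. A common way to implement the adversarial component is a gradient-reversal layer (GRL) placed between the shared encoder and a language discriminator; the sketch below is a minimal illustration under that assumption — the paper's actual mechanism is not shown in this excerpt, and all names here are hypothetical.

```python
import numpy as np

class GradientReversal:
    """Identity in the forward pass; flips (and scales) gradients on the way back.

    Hypothetical sketch of the standard trick behind adversarial
    shared/private representation learning; not the paper's own code.
    """

    def __init__(self, lam=1.0):
        self.lam = lam  # trade-off weight for the adversarial loss

    def forward(self, x):
        # Shared-encoder features pass through unchanged.
        return x

    def backward(self, grad_output):
        # The language discriminator's gradient is reversed, pushing the
        # shared encoder toward language-invariant features.
        return -self.lam * grad_output


grl = GradientReversal(lam=0.5)
features = np.array([0.2, -1.3, 0.7])        # toy "shared" sentence features
grad_from_lang_clf = np.array([1.0, -2.0, 0.5])

out = grl.forward(features)                  # unchanged features
grad_to_encoder = grl.backward(grad_from_lang_clf)  # reversed, scaled gradient
```

In a full model, `forward` would sit between the BERT encoder and the language classifier, so the classifier learns to detect the language while the reversed gradient trains the encoder to hide it.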

Cited by 1 publication (1 citation statement)
References 15 publications
“…The majority of those low-resource languages lack datasets or models and have usually not been thoroughly evaluated [12]. Moreover, the use of multilingual models cannot yet ensure performance similar to monolingual ones [13], as each language has its own unique characteristics that should be considered. Consequently, we cannot use pipelines implemented in English as-is for similar reasons.…”
Section: Introduction (mentioning)
Confidence: 99%