2022
DOI: 10.48550/arxiv.2211.08547
Preprint

ALIGN-MLM: Word Embedding Alignment is Crucial for Multilingual Pre-training

Abstract: Multilingual pre-trained models exhibit zero-shot cross-lingual transfer, where a model fine-tuned on a source language achieves surprisingly good performance on a target language. While prior studies have attempted to understand transfer, they focus only on MLM, and the large number of differences between natural languages makes it hard to disentangle the importance of different properties. In this work, we specifically highlight the importance of word embedding alignment by proposing a pre-training objective (ALIGN-MLM) …
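The abstract stops short of defining the objective, but the core idea it names (adding a word-embedding alignment term alongside the standard MLM loss) can be sketched. Below is a minimal, hypothetical PyTorch sketch assuming a cosine-similarity alignment term over a bilingual dictionary of word pairs; the pair source, the loss form, and the weight `lam` are illustrative assumptions, not the paper's exact ALIGN-MLM formulation.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
vocab_size, hidden = 1000, 64
embedding = torch.nn.Embedding(vocab_size, hidden)

# Toy MLM head output for 8 masked positions (a real model would produce
# these logits with a transformer encoder over masked input).
logits = torch.randn(8, vocab_size, requires_grad=True)
targets = torch.randint(0, vocab_size, (8,))
mlm_loss = F.cross_entropy(logits, targets)

# Hypothetical bilingual dictionary: (source-word id, target-word id) pairs.
src_ids = torch.tensor([3, 17, 42])
tgt_ids = torch.tensor([503, 517, 542])

# Alignment term: pull embeddings of translation pairs together, here as
# 1 - cosine similarity averaged over pairs (an assumed form of the loss).
align_loss = (1.0 - F.cosine_similarity(
    embedding(src_ids), embedding(tgt_ids), dim=-1)).mean()

lam = 0.1  # assumed weight for the alignment term
total_loss = mlm_loss + lam * align_loss
total_loss.backward()
print(f"mlm={mlm_loss.item():.3f}  align={align_loss.item():.3f}")
```

In this framing the alignment term acts as an auxiliary loss during pre-training, so translation-equivalent words are pushed toward similar embeddings across languages, which is the property the abstract identifies as crucial for zero-shot cross-lingual transfer.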

Cited by 0 publications
References 20 publications
