2021
DOI: 10.48550/arxiv.2106.05634
Preprint

Exploring Unsupervised Pretraining Objectives for Machine Translation

Abstract: Unsupervised cross-lingual pretraining has achieved strong results in neural machine translation (NMT), by drastically reducing the need for large parallel data. Most approaches adapt masked-language modeling (MLM) to sequence-to-sequence architectures, by masking parts of the input and reconstructing them in the decoder. In this work, we systematically compare masking with alternative objectives that produce inputs resembling real (full) sentences, by reordering and replacing words based on their context. We …
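
The abstract contrasts two ways of corrupting the encoder input for sequence-to-sequence pretraining: MLM-style masking, and noise that keeps the input looking like a real (full) sentence by reordering words and replacing some with context-based substitutes. The sketch below is not the authors' code; the function names, probabilities, window size, and the `substitute_fn` hook are illustrative assumptions.

```python
# Minimal sketch of two seq2seq pretraining noise functions, assuming
# whitespace-tokenized input and a toy substitution model.
import random
from typing import Callable, List

MASK = "<mask>"

def mask_noise(tokens: List[str], mask_prob: float = 0.35) -> List[str]:
    """MLM-style input: replace a fraction of tokens with a mask symbol.
    The decoder is then trained to reconstruct the original sequence."""
    return [MASK if random.random() < mask_prob else tok for tok in tokens]

def reorder_replace_noise(
    tokens: List[str],
    substitute_fn: Callable[[str], str],
    shuffle_window: int = 3,
    replace_prob: float = 0.15,
) -> List[str]:
    """Input that still resembles a real sentence: some tokens are swapped
    for contextual substitutes, and word order is shuffled locally."""
    noised = [substitute_fn(tok) if random.random() < replace_prob else tok
              for tok in tokens]
    out: List[str] = []
    for i in range(0, len(noised), shuffle_window):
        chunk = noised[i:i + shuffle_window]
        random.shuffle(chunk)  # permute tokens within a small window
        out.extend(chunk)
    return out

if __name__ == "__main__":
    sentence = "unsupervised pretraining reduces the need for parallel data".split()
    # Stand-in for a context-aware substitution model (e.g. sampling from an LM).
    toy_substitutes = {"reduces": "lowers", "parallel": "bilingual"}
    print("masked   :", mask_noise(sentence))
    print("reordered:", reorder_replace_noise(sentence,
                                              lambda t: toy_substitutes.get(t, t)))
```

In both cases the decoder target is the original sentence; what differs is whether the encoder sees masked placeholders or a plausible-looking but corrupted sentence.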

Cited by 0 publications
References 6 publications