Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019)
DOI: 10.18653/v1/n19-1387

Addressing word-order Divergence in Multilingual Neural Machine Translation for extremely Low Resource Languages

Abstract: Transfer learning approaches for Neural Machine Translation (NMT) train an NMT model on an assisting language-target language pair (the parent model), which is later fine-tuned for the source language-target language pair of interest (the child model), with the target language being the same. In many cases, the assisting language has a different word order from the source language. We show that divergent word order adversely limits the benefits from transfer learning when little to no parallel corpus between the source…
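
The abstract describes the standard parent-child fine-tuning recipe: train on an abundant assisting-target corpus, then continue training the same parameters on the scarce source-target corpus. A minimal sketch of that two-stage loop follows; the TinyNMT model, the train_step helper, the random stand-in batches, and the learning rates are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of parent-child transfer learning for NMT, assuming a toy
# PyTorch encoder-decoder. TinyNMT, train_step, the random stand-in batches,
# and the learning rates are illustrative, not the paper's implementation.
import torch
import torch.nn as nn

class TinyNMT(nn.Module):
    """Toy encoder-decoder standing in for a full NMT architecture."""
    def __init__(self, vocab_size=1000, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, src, tgt_in):
        _, h = self.encoder(self.embed(src))          # encode source sentence
        dec, _ = self.decoder(self.embed(tgt_in), h)  # teacher-forced decoding
        return self.out(dec)                          # per-token logits

def train_step(model, opt, src, tgt_in, tgt_out):
    logits = model(src, tgt_in)                       # (batch, len, vocab)
    loss = nn.functional.cross_entropy(logits.transpose(1, 2), tgt_out)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

def random_batch():  # stand-in for a real parallel mini-batch
    src = torch.randint(0, 1000, (2, 5))
    tgt = torch.randint(0, 1000, (2, 6))
    return src, tgt[:, :-1], tgt[:, 1:]

model = TinyNMT()

# Stage 1 (parent model): train on the abundant assisting->target corpus.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
train_step(model, opt, *random_batch())

# Stage 2 (child model): fine-tune the SAME parameters on the scarce
# source->target corpus; the target language stays the same.
opt = torch.optim.Adam(model.parameters(), lr=1e-4)   # gentler updates
train_step(model, opt, *random_batch())
```

The essential point is that the child stage reuses the parent's learned parameters rather than reinitializing them; only the training data (and, typically, the learning rate) changes.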


Cited by 30 publications (38 citation statements) | References 14 publications
Citation types: 0 supporting, 37 mentioning, 0 contrasting
“…The choice of a single encoder for all languages is also promoted by Hokamp et al [64], who opt for language-specific decoders. Murthy et al [101] pointed out that the sentence representations generated by the encoder are dependent on the word order of the language and are, hence, language-specific. They focused on reordering input sentences to reduce the divergence caused due to different word orders to improve the quality of transfer learning.…”
Section: Addressing Language Divergence (mentioning; confidence: 99%)
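
The single-encoder, language-specific-decoder design mentioned in this excerpt can be sketched roughly as below; the MultiDecoderNMT class, its module layout, and the language codes are assumptions for illustration, not Hokamp et al.'s actual architecture.

```python
# A rough sketch of a multilingual NMT model with one shared encoder and
# language-specific decoders; class and module names are hypothetical.
import torch
import torch.nn as nn

class MultiDecoderNMT(nn.Module):
    def __init__(self, vocab_size=1000, hidden=64, target_langs=("hi", "pa")):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)  # shared by all languages
        self.decoders = nn.ModuleDict({                          # one decoder per target language
            lang: nn.GRU(hidden, hidden, batch_first=True)
            for lang in target_langs
        })
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, src, tgt_in, lang):
        # The encoder output depends on the source word order, which is the
        # language-specificity Murthy et al. point out in the excerpt above.
        _, h = self.encoder(self.embed(src))
        dec, _ = self.decoders[lang](self.embed(tgt_in), h)
        return self.out(dec)

model = MultiDecoderNMT()
src = torch.randint(0, 1000, (2, 5))
tgt_in = torch.randint(0, 1000, (2, 5))
print(model(src, tgt_in, "hi").shape)  # torch.Size([2, 5, 1000])
```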
“…Although it is important to address this issue, there are surprisingly few works that do so. Murthy et al [101] showed that reducing the word order divergence between source languages by reordering the parent sentences to match child word order is beneficial in extremely low-resource scenarios. Since reordering is part of the pre-processing pipeline, it is referred to as pre-ordering.…”
Section: Syntactic Transfer (mentioning; confidence: 99%)
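
Pre-ordering, as described in this excerpt, rewrites the parent (assisting-language) source sentences into the child language's word order before training. The toy sketch below illustrates the idea with a single SVO-to-SOV rule; the Token record and the pre-annotated role tags are simplifying assumptions (real pre-ordering systems, including the one cited, apply rules over a syntactic parse).

```python
# A toy sketch of rule-based pre-ordering: move verbs after their objects so
# an SVO source (e.g., English-like) matches an SOV child language
# (e.g., Hindi-like). Token roles are assumed given; real systems derive them
# from a parser.
from dataclasses import dataclass

@dataclass
class Token:
    text: str
    role: str  # "subj", "verb", "obj", or "other" (assumed pre-annotated)

def preorder_svo_to_sov(tokens: list[Token]) -> list[Token]:
    """Push all verbs to the clause end, preserving their relative order."""
    non_verbs = [t for t in tokens if t.role != "verb"]
    verbs = [t for t in tokens if t.role == "verb"]
    return non_verbs + verbs

sentence = [Token("she", "subj"), Token("reads", "verb"), Token("books", "obj")]
print(" ".join(t.text for t in preorder_svo_to_sov(sentence)))
# -> she books reads   (SOV order, closer to the child language)
```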
“…Fine-tuning based transfer learning has been studied for transferring proper parameters (Gu et al., 2018b), lexical (Nguyen and Chiang, 2017; Gu et al., 2018a; Lakew et al., 2018), and syntactic (Gu et al., 2018a; Murthy et al., 2018) knowledge from a resource-rich language pair to a resource-poor language pair. On the other hand, Chu et al. (2017) proposed a more robust training approach for domain adaptation, called mixed fine-tuning, which uses a mixture of data from different domains.…”
Section: Related Work (mentioning; confidence: 99%)
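
Mixed fine-tuning, as cited above, continues training on a mixture of the original out-of-domain data and oversampled in-domain data rather than fine-tuning on the in-domain data alone. A minimal sketch, in which the corpus variables and the oversampling factor are assumptions:

```python
# A minimal sketch of building a mixed fine-tuning corpus: oversample the
# small in-domain set so that mini-batches drawn from the shuffled mixture
# regularly contain both domains. The factor 5 is an arbitrary assumption.
import random

def mixed_fine_tuning_corpus(out_of_domain, in_domain, oversample=5):
    mixed = list(out_of_domain) + list(in_domain) * oversample
    random.shuffle(mixed)
    return mixed

out_domain = [(f"ood_src_{i}", f"ood_tgt_{i}") for i in range(1000)]
in_domain = [(f"id_src_{i}", f"id_tgt_{i}") for i in range(50)]
corpus = mixed_fine_tuning_corpus(out_domain, in_domain)
print(len(corpus))  # 1250 = 1000 out-of-domain + 5 * 50 in-domain pairs
```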
“…Murthy et al [27] exploited preordering for low-resource NMT with transfer learning. They first trained the translation model on languages with an abundant parallel corpus.…”
Section: B. Usage of Reordering Information in NMT (mentioning; confidence: 99%)