Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d17-1155
Instance Weighting for Neural Machine Translation Domain Adaptation

Abstract: Instance weighting has been widely applied to phrase-based machine translation domain adaptation. However, it is challenging to apply it directly to Neural Machine Translation (NMT), because NMT is not a linear model. In this paper, two instance weighting techniques, i.e., sentence weighting and domain weighting with a dynamic weight learning strategy, are proposed for NMT domain adaptation. Empirical results on the IWSLT English-German/French tasks show that the proposed methods can substantially improve …
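The core idea of sentence weighting — scaling each training sentence's loss by a weight reflecting how in-domain it is — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the toy batch, and the hand-set weights are all assumptions; in practice the weights would come from a domain-relevance score or the paper's dynamic weight learning strategy.

```python
import numpy as np

def weighted_nmt_loss(token_logprobs, sentence_weights):
    """Sentence-weighted negative log-likelihood.

    token_logprobs: list of 1-D arrays, log P(y_t | y_<t, x) for each
                    target token of each sentence in the batch.
    sentence_weights: per-sentence weights, e.g. larger for sentences
                      judged in-domain (hypothetical scoring step).
    """
    # Per-sentence NLL: negate the sum of token log-probabilities.
    losses = np.array([-lp.sum() for lp in token_logprobs])
    # Weighted average over the batch; uniform weights recover the
    # standard (unweighted) NMT objective.
    return float((sentence_weights * losses).sum() / sentence_weights.sum())

# Toy batch of two sentences, the first assumed more in-domain.
batch = [np.log(np.array([0.5, 0.25])), np.log(np.array([0.1]))]
weights = np.array([2.0, 1.0])
print(round(weighted_nmt_loss(batch, weights), 4))  # → 2.1538
```

With all weights equal the loss reduces to the plain batch-mean NLL, which is what makes this a strict generalization of ordinary NMT training.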

Cited by 128 publications (90 citation statements)
References 13 publications
“…Meanwhile, applying data weighting to NMT domain adaptation has attracted much attention. Wang et al. (2017a) and Wang et al. (2017b) proposed several sentence and domain weighting methods with a dynamic weight learning strategy. Zhang et al. (2019a) ranked unlabeled domain training samples based on their similarity to in-domain data, and then adopted a probabilistic curriculum learning strategy during training.…”
Section: Related Work
confidence: 99%
“…They modify the NMT model to also accept the marked target sentence as input and train it to produce similar sentences that do not contain marked words. (Wang et al., 2017) proposed a sentence-level weighting method for domain adaptation in NMT.…”
Section: Related Work
confidence: 99%
“…Our training objective (2) can be seen as a generalization of the original training objective (1) and of instance weighting methods [19,20]. The original loss (1) sets w_t = 1 for every word in all sentences.…”
Section: Objective
confidence: 99%
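The generalization quoted above — a per-word weight w_t multiplying each token's log-likelihood, recovering the standard objective when every w_t = 1 — might look like this minimal numpy sketch; the function name and the toy values are illustrative assumptions, not the cited paper's code:

```python
import numpy as np

def word_weighted_loss(token_logprobs, word_weights):
    """Per-word weighted NLL: -sum_t w_t * log P(y_t | y_<t, x).
    Setting every w_t = 1 reduces this to the standard NMT loss."""
    lp = np.asarray(token_logprobs)
    w = np.asarray(word_weights)
    return float(-(w * lp).sum())

logprobs = np.log([0.5, 0.25, 0.1])                   # toy per-token log-probs
standard = word_weighted_loss(logprobs, [1, 1, 1])    # objective with w_t = 1
weighted = word_weighted_loss(logprobs, [1, 2, 0.5])  # illustrative per-word w_t
```

Word-level weights subsume sentence weighting (give every token in a sentence the same weight) and instance weighting, which is why the quoted work treats both as special cases of one objective.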
“…Mixed fine-tuning [18] combines fine-tuning and multi-domain NMT. Similar to instance weighting in SMT, sentence/domain weighting methods [19,20] can also be used for NMT domain adaptation, based on different objectives. Domain adaptation with meta information [21] trains topic-aware models using domain-specific tags for the decoder.…”
Section: Introduction
confidence: 99%