Proceedings of the Third Conference on Machine Translation: Shared Task Papers 2018
DOI: 10.18653/v1/w18-6447

Hunter NMT System for WMT18 Biomedical Translation Task: Transfer Learning in Neural Machine Translation

Abstract: This paper describes the submission of Hunter Neural Machine Translation (NMT) to the WMT'18 Biomedical translation task from English to French. The discrepancy between the training and test data distributions makes translating text in new domains challenging. Beyond previous work that combines in-domain with out-of-domain models, we found accuracy and efficiency gains in combining different in-domain models. We conduct extensive experiments on NMT with transfer learning. We train on different in-domain Biomedi…
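
The transfer-learning recipe the abstract outlines (pretrain on a large out-of-domain corpus, then continue training on in-domain data) can be sketched as follows. This is a minimal PyTorch illustration, not the authors' actual system; the Seq2SeqNMT class, the "general.pt" checkpoint path, the hyperparameters, and the dummy batch data are all assumptions made for the example.

import os
import torch
from torch import nn

# Hypothetical seq2seq model, standing in for whatever architecture
# the general-domain NMT system was trained with.
class Seq2SeqNMT(nn.Module):
    def __init__(self, vocab_size=32000, d_model=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.GRU(d_model, d_model, batch_first=True)
        self.decoder = nn.GRU(d_model, d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt):
        _, h = self.encoder(self.embed(src))       # encode source sentence
        dec, _ = self.decoder(self.embed(tgt), h)  # decode, seeded by encoder state
        return self.out(dec)                       # per-token target-vocab logits

model = Seq2SeqNMT()

# Transfer learning: instead of random initialization, start from a
# general-domain checkpoint ("general.pt" is a placeholder path)...
if os.path.exists("general.pt"):
    model.load_state_dict(torch.load("general.pt"))

# ...then continue training on the in-domain corpus, typically with a
# smaller learning rate so general-domain knowledge is not wiped out.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Dummy stand-in for batches of (source, target) token-id tensors
# drawn from a biomedical parallel corpus.
in_domain_batches = [(torch.randint(0, 32000, (8, 20)),
                      torch.randint(0, 32000, (8, 21))) for _ in range(4)]

for src, tgt in in_domain_batches:
    logits = model(src, tgt[:, :-1])  # teacher forcing
    loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                   tgt[:, 1:].reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()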

Citations: Cited by 8 publications (10 citation statements)
References: 11 publications
“…Previous work on transfer learning typically aims to find a single model that performs well on a known domain of interest (Khan et al., 2018).…”
Section: Adaptive Decoding
Mentioning confidence: 99%
“…Transfer learning is an approach in which a model is trained using knowledge from an existing model (Khan et al., 2018). Transfer learning typically involves initial training on a large, general-domain corpus, followed by fine-tuning on the domain of interest.…”
Section: Introduction
Mentioning confidence: 99%
“…The resultant BLEU scores did not improve by more than 0.5. Khan et al. (2018) trained three NMT systems with different corpus groupings. One experiment used only the in-domain corpus, whereas the other two trained on the in-domain corpus with parameters initialized from a general-domain system.…”
Section: Related Work
Mentioning confidence: 99%
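
The three-system comparison described in that statement can be organized as a small experiment grid. A hedged sketch, reusing the hypothetical Seq2SeqNMT class from the example after the abstract; the corpus names and the train() helper are placeholders.

import os
import torch

# Three hypothetical runs mirroring the comparison described above:
# one trained on in-domain data from scratch, two initialized from a
# general-domain checkpoint before in-domain training.
experiments = [
    {"name": "in-domain-only", "init": None,         "corpus": "biomedical"},
    {"name": "transfer-run-1", "init": "general.pt", "corpus": "biomedical"},
    {"name": "transfer-run-2", "init": "general.pt", "corpus": "biomedical-plus"},
]

def train(model, corpus):
    """Placeholder for the fine-tuning loop sketched earlier."""
    pass

for exp in experiments:
    model = Seq2SeqNMT()  # hypothetical model class from the earlier sketch
    if exp["init"] is not None and os.path.exists(exp["init"]):
        model.load_state_dict(torch.load(exp["init"]))
    train(model, exp["corpus"])
    torch.save(model.state_dict(), exp["name"] + ".pt")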
“…To overcome this challenge, various studies explore numerous techniques to improve NMT quality, especially in low-resource settings. Domain adaptation (Freitag and Al-Onaizan, 2016), transfer learning (Zoph et al., 2016; Khan et al., 2018), fine-tuning (Dakwale and Monz, 2017; Huck et al., 2018), and data-selective training (van der Wees et al., 2017) are a few terms used interchangeably for such techniques in the literature.…”
Section: Introduction
Mentioning confidence: 99%
“…We report SARI scores as (re)computed by Mallinson et al. (2020) for all systems in Table 8 to ensure comparability. Multi-stage fine-tuning has been proven effective for other sequence tasks such as machine translation (Khan et al., 2018; Saunders et al., 2019).…”
Mentioning confidence: 99%
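
The multi-stage fine-tuning mentioned in that last statement simply chains the transfer step: each stage resumes from the previous stage's weights, moving from general to increasingly in-domain data. A minimal sketch under the same assumptions (hypothetical corpus names, the Seq2SeqNMT class and train() helper from the earlier examples):

import torch

# Multi-stage fine-tuning: each stage starts from the previous stage's
# checkpoint, ordered from most general to most in-domain. Corpus names
# and the train() helper are placeholders.
stages = [
    ("general-domain", "stage0.pt"),
    ("broad-biomedical", "stage1.pt"),
    ("target-subdomain", "stage2.pt"),
]

checkpoint = None
for corpus, out_path in stages:
    model = Seq2SeqNMT()  # hypothetical model class from the first sketch
    if checkpoint is not None:
        model.load_state_dict(torch.load(checkpoint))
    train(model, corpus)  # placeholder fine-tuning loop
    torch.save(model.state_dict(), out_path)
    checkpoint = out_path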