Proceedings of the Fourth Conference on Machine Translation (Volume 3: Shared Task Papers, Day 2) 2019
DOI: 10.18653/v1/w19-5421

UCAM Biomedical Translation at WMT19: Transfer Learning Multi-domain Ensembles

Abstract: The 2019 WMT Biomedical translation task involved translating Medline abstracts. We approached this by using transfer learning to obtain a series of strong neural models on distinct domains and combining them into multi-domain ensembles. We further experimented with an adaptive language-model ensemble weighting scheme. Our submission achieved the best submitted results in both directions of English-Spanish.
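The adaptive language-model ensemble weighting the abstract mentions is not spelled out on this page. The sketch below shows one plausible reading, as a minimal toy in Python: at each decoding step, the component domain models' next-token distributions are combined with weights derived from how strongly hypothetical domain language models score the current prefix. All names here (DomainModel, lm_score, VOCAB) are invented for illustration and are not the paper's implementation.

# Minimal sketch of adaptive multi-domain ensemble weighting at decode time.
# DomainModel, lm_score, and VOCAB are hypothetical stand-ins.
import numpy as np

VOCAB = 8  # toy vocabulary size

class DomainModel:
    """Stand-in for a domain-specific NMT model: returns next-token log-probs."""
    def __init__(self, seed):
        self.rng = np.random.default_rng(seed)

    def log_probs(self, prefix):
        logits = self.rng.normal(size=VOCAB)
        return logits - np.log(np.exp(logits).sum())  # log-softmax

def lm_score(prefix, domain):
    """Hypothetical domain LM score of the prefix; higher = more in-domain."""
    rng = np.random.default_rng(hash((tuple(prefix), domain)) % 2**32)
    return rng.uniform()

def ensemble_step(models, prefix):
    # Adaptive weights: renormalised domain-LM scores of the current prefix.
    scores = np.array([lm_score(prefix, d) for d in range(len(models))])
    weights = scores / scores.sum()
    # Weighted combination of the component distributions, in probability space.
    probs = sum(w * np.exp(m.log_probs(prefix))
                for w, m in zip(weights, models))
    return int(np.argmax(probs))

models = [DomainModel(s) for s in (0, 1, 2)]  # e.g. news / biomedical / mixed
print("next token:", ensemble_step(models, prefix=[1, 4]))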

Cited by 11 publications (12 citation statements) | References 13 publications
“…Another parameter-sharing scheme is in Jiang et al. (2019), which augments a Transformer model with domain-specific heads whose contributions are regulated at the word/position level: some words have “generic” use and rely on mixed-domain heads, whereas for other words it is preferable to use domain-specific heads, thereby reintroducing the idea of ensembling at the core of Huck et al. (2015) and Saunders et al. (2019). The results for three language pairs outperform several standard baselines for two-domain systems (in fr:en and de:en) and a four-domain system (zh:en).…”
Section: Related Work
confidence: 99%
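The word-level head mixing described in the statement above can be sketched as follows. This is a hypothetical illustration rather than Jiang et al.'s (2019) published implementation: a per-token sigmoid gate (the class name WordLevelDomainMixer is invented) interpolates between a shared attention block and a domain-specific one, so “generic” words can lean on the mixed-domain path while domain-bound words use the specialised one.

# Hypothetical sketch of word-level mixing between shared and domain-specific
# attention outputs, in the spirit of Jiang et al. (2019). Names and shapes
# are illustrative assumptions.
import torch
import torch.nn as nn

class WordLevelDomainMixer(nn.Module):
    def __init__(self, d_model, n_domains):
        super().__init__()
        self.shared = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.domain = nn.ModuleList(
            nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
            for _ in range(n_domains))
        # Per-token gate: how much each word relies on the domain-specific head.
        self.gate = nn.Linear(d_model, 1)

    def forward(self, x, domain_id):
        shared_out, _ = self.shared(x, x, x)
        domain_out, _ = self.domain[domain_id](x, x, x)
        g = torch.sigmoid(self.gate(x))          # (batch, seq, 1): one gate per word
        return g * domain_out + (1 - g) * shared_out

x = torch.randn(2, 5, 64)                        # (batch, seq, d_model)
layer = WordLevelDomainMixer(d_model=64, n_domains=2)
print(layer(x, domain_id=1).shape)               # torch.Size([2, 5, 64])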
“…UCAM (Saunders et al., 2019). The UCAM team relied on transfer learning and used the Tensor2Tensor implementation of the Transformer model.…”
Section: Oo
confidence: 99%
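The two-stage transfer-learning recipe this statement implies (train on general-domain data, then continue training the same parameters on in-domain biomedical data) can be sketched as below. The toy model, checkpoint name, and synthetic batches are stand-ins for illustration; the actual submission used Tensor2Tensor's Transformer, whose training loop is not reproduced here.

# Sketch of domain transfer by checkpoint warm-starting: no re-initialisation,
# just continued training on in-domain data. The nn.Linear model and the
# synthetic "biomedical" batches are toy stand-ins.
import torch
import torch.nn as nn

model = nn.Linear(16, 16)                        # toy stand-in for a Transformer
torch.save(model.state_dict(), "general.pt")     # pretend general-domain training

model.load_state_dict(torch.load("general.pt"))  # warm-start from the checkpoint
opt = torch.optim.Adam(model.parameters(), lr=1e-4)  # typically a reduced LR

for _ in range(3):                               # continue training in-domain
    x = torch.randn(8, 16)                       # synthetic in-domain batch
    loss = nn.functional.mse_loss(model(x), x)
    opt.zero_grad(); loss.backward(); opt.step()
print("fine-tuned loss:", loss.item())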
“…In the last two WMT biomedical translation challenges (WMT18 and WMT19) (Neves et al., 2018; Bawden et al., 2019), the submissions that achieved the best BLEU scores for ES/EN and PT/EN, in both directions (Soares and Becker, 2018; Tubay and Costa-Jussà, 2018; Carrino et al., 2019; Saunders et al., 2019; …), used the Transformer architecture with enhancements such as handling of terminology during tokenization (Carrino et al., 2019), multi-domain inference (Saunders et al., 2019) and exploitation of additional linguistic resources (Soares and Becker, 2018; …).…”
Section: Related Work
confidence: 99%