Proceedings of the 55th Annual Meeting of the Association For Computational Linguistics (Volume 1: Long Papers) 2017
DOI: 10.18653/v1/p17-1176
A Teacher-Student Framework for Zero-Resource Neural Machine Translation

Abstract: While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model ("student") without parallel corpora available, guided by an existing pivot-to-target NM…
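The core idea in the abstract is that the student (source-to-target) model is trained to match the teacher's (pivot-to-target) predictive distributions on pivot-parallel data. A minimal sketch of word-level distillation in that spirit — using toy probability tables and hypothetical names, not the paper's actual implementation:

```python
import math

def word_kd_loss(teacher_probs, student_probs):
    """Word-level distillation loss: cross-entropy of the student's
    distribution against the teacher's, summed over target positions.
    Each argument is a list (one entry per target position) of dicts
    mapping candidate target words to probabilities."""
    loss = 0.0
    for t_dist, s_dist in zip(teacher_probs, student_probs):
        for word, p_t in t_dist.items():
            p_s = s_dist.get(word, 1e-12)  # floor to avoid log(0)
            loss -= p_t * math.log(p_s)
    return loss

# Toy example on a 2-word target sentence: the teacher sees the pivot
# sentence, each candidate student sees the (unpaired) source sentence.
teacher       = [{"la": 0.9, "le": 0.1}, {"maison": 0.8, "chat": 0.2}]
student_close = [{"la": 0.85, "le": 0.15}, {"maison": 0.75, "chat": 0.25}]
student_far   = [{"la": 0.2, "le": 0.8}, {"maison": 0.3, "chat": 0.7}]

# The student whose distributions agree with the teacher incurs lower loss.
assert word_kd_loss(teacher, student_close) < word_kd_loss(teacher, student_far)
```

Minimizing this loss pushes the student toward the teacher's distributions without ever requiring source-target parallel sentences.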


Cited by 75 publications (66 citation statements)
References 20 publications (40 reference statements)
“…As we mentioned in the main text of the paper, distillation (Chen et al, 2017) and pivoting yield zero-shot consistent models. Let us understand why this is the case.…”
Section: A3 Consistency Of Distillation And Pivoting
Confidence: 74%
“…We see that models trained with agreement perform comparably to Pivot, outperforming it in some cases, e.g., when the target is Russian, perhaps because it is quite different linguistically from the English pivot.…” (Footnotes in the excerpt: 9 www.cs.cmu.edu/˜mshediva/code/; ‡ Distillation (Chen et al, 2017).)
Section: Training And Evaluation
Confidence: 96%