2023
DOI: 10.1609/aaai.v37i11.26497

Improving Simultaneous Machine Translation with Monolingual Data

Abstract: Simultaneous machine translation (SiMT) is usually done via sequence-level knowledge distillation (Seq-KD) from a full-sentence neural machine translation (NMT) model. However, there is still a significant performance gap between NMT and SiMT. In this work, we propose to leverage monolingual data to improve SiMT, which trains a SiMT student on the combination of bilingual data and external monolingual data distilled by Seq-KD. Preliminary experiments on En-Zh and En-Ja news domain corpora demonstrate that mono…
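To make the distillation recipe described in the abstract concrete, below is a minimal sketch of the Seq-KD data-construction step: a full-sentence NMT teacher translates source-side monolingual sentences, and the distilled pairs are mixed with the original bilingual data before training the SiMT student. All names here (teacher_translate, build_simt_training_data) and the stubbed translations are hypothetical placeholders for illustration, not the authors' code or data.

```python
# Hypothetical sketch of Seq-KD data construction for SiMT training.
# The teacher model is stubbed out; in practice it would be a trained
# full-sentence NMT model decoding with beam search.

def teacher_translate(src_sentences):
    """Distill targets with a full-sentence NMT teacher (stubbed here)."""
    return [f"<teacher translation of: {s}>" for s in src_sentences]

def build_simt_training_data(bilingual_pairs, monolingual_sources):
    """Combine gold bilingual pairs with monolingual sources whose
    targets are produced by the teacher (sequence-level KD)."""
    distilled_pairs = list(zip(monolingual_sources,
                               teacher_translate(monolingual_sources)))
    return bilingual_pairs + distilled_pairs

if __name__ == "__main__":
    bilingual = [("the cat sat on the mat", "<gold target sentence>")]  # parallel data
    monolingual = ["a dog barked", "it rained yesterday"]               # source-side only
    for src, tgt in build_simt_training_data(bilingual, monolingual):
        print(src, "->", tgt)
```

The SiMT student would then be trained on the combined pairs exactly as on ordinary parallel data; the distilled targets simply stand in for references on the monolingual portion.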

Cited by 1 publication (1 citation statement)
References: 34 publications
“…Speech-to-Text Translation Most studies have been conducted to enhance end-to-end S2T models. propose a multi-task learning approach that jointly performs automatic speech recognition and S2T, while Liu et al (2019b) present a knowledge distillation technique (Deng et al, 2023) by transferring knowledge from T2T models. However, previous works indicate that their successes heavily rely on large amounts of labeled training data, which is challenging to acquire.…”
Section: Related Work
confidence: 99%