Proceedings of the 18th International Conference on Spoken Language Translation (IWSLT 2021)
DOI: 10.18653/v1/2021.iwslt-1.3

NAIST English-to-Japanese Simultaneous Translation System for IWSLT 2021 Simultaneous Text-to-text Task

Abstract: This paper describes NAIST's system for the English-to-Japanese Simultaneous Text-to-text Translation Task in the IWSLT 2021 Evaluation Campaign. Our primary submission is based on wait-k neural machine translation with sequence-level knowledge distillation to encourage literal translation.
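To make the wait-k policy concrete: the decoder begins writing after reading the first k source tokens, then alternates reading one token and writing one token. The sketch below is a minimal, hypothetical illustration of that read/write schedule; `encode`, `decode_step`, and the token IDs are stand-ins for whatever model interface is used, not the authors' actual code, and greedy decoding is assumed for simplicity.

# Minimal sketch of the wait-k decoding policy the submission builds on.
# `encode` and `decode_step` are hypothetical stand-ins for a real
# encoder-decoder interface.

def wait_k_translate(src_tokens, k, encode, decode_step, eos_id, max_len=200):
    """Emit one target token per newly read source token, after an
    initial delay of k source tokens."""
    target = []
    read = min(k, len(src_tokens))           # read the first k tokens before writing
    while len(target) < max_len:
        enc = encode(src_tokens[:read])      # re-encode the visible source prefix
        next_tok = decode_step(enc, target)  # greedy choice of the next target token
        if next_tok == eos_id:
            break
        target.append(next_tok)
        if read < len(src_tokens):           # alternate: read one more, then write one
            read += 1
    return target

In this scheme, smaller k lowers latency but forces the model to commit earlier, which is why training on more literal (distilled) references, as described in the abstract, tends to help.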

Cited by 1 publication (1 citation statement); references 15 publications (13 reference statements).
“…We closely follow previous SiMT works (Ren et al. 2020; Zhang, Feng, and Li 2021; Fukuda et al. 2021; Liu et al. 2021a; Zhao et al. 2021) to adopt sequence-level knowledge distillation (Kim and Rush 2016) for all systems. Specifically, we train a full-sentence BASE Transformer (Vaswani et al. 2017) as the teacher on the original bilingual dataset, then perform beam-search decoding on the source side of the original bilingual data or newly introduced monolingual data to generate the distilled data.…”
Section: Model Training
Confidence: 99%
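The distillation recipe the citation describes reduces to a simple data-building loop: a full-sentence teacher beam-searches each source sentence, and its hypotheses replace the original references for student training. Below is a minimal sketch of that loop under the assumption of a `teacher.translate(src, beam=...)` interface, which is hypothetical and not a specific library call.

# Minimal sketch of sequence-level knowledge distillation
# (Kim and Rush 2016) as described in the citation statement.

def build_distilled_data(teacher, source_sentences, beam_size=5):
    """Return (source, teacher_hypothesis) pairs for student training."""
    distilled = []
    for src in source_sentences:
        hyp = teacher.translate(src, beam=beam_size)  # full-sentence beam search
        distilled.append((src, hyp))                  # teacher output becomes the new target
    return distilled

The wait-k student is then trained on these pairs instead of the original bilingual data; because the teacher's outputs are typically more monotonic and literal than human references, they are a better fit for a low-latency policy.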