2022
DOI: 10.48550/arxiv.2204.09269
Preprint
A Survey on Non-Autoregressive Generation for Neural Machine Translation and Beyond

Abstract: Non-autoregressive (NAR) generation, which was first proposed in neural machine translation (NMT) to speed up inference, has attracted much attention in both the machine learning and natural language processing communities. While NAR generation can significantly accelerate inference for machine translation, the speedup comes at the cost of reduced translation accuracy compared to its counterpart, autoregressive (AR) generation. In recent years, many new models and algorithms have been proposed to…

Cited by 3 publications (3 citation statements)
References 116 publications (228 reference statements)
“…However, due to the autoregressive nature, searching for the word at the current position only considers information from the left side, not the full context. Besides, the autoregressive nature tends to bring the mode collapse problem [74], resulting in captions with less diversity. Moreover, the time cost of iterative gradient updates is high, especially for long captions.…”
Section: BERT Encoder
Confidence: 99%
“…However, such autoregressive generation often results in issues such as sequential error accumulation and lack of diversity [13,74]. Further, for zero-shot IC, the sequential search order lacks flexibility.…”
Section: Sampling-Based Language Model for P(x_{1:n})
Confidence: 99%
“…Non-autoregressive (NAR) generation refers to a method of generating sequences where each element is generated independently, without relying on previously generated elements, allowing for faster parallel generation but potentially sacrificing generation accuracy (Xiao et al., 2022). Recently, diffusion models have demonstrated powerful generative capabilities in image generation tasks, gradually becoming a new paradigm in generative models.…”
Section: Introduction
Confidence: 99%
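The AR-versus-NAR distinction described in the statement above can be illustrated with a minimal sketch. This is a toy illustration, not code from the survey: `toy_next_token` and the `w<i>` token scheme are hypothetical stand-ins for a real model. The point is purely structural: AR decoding needs one model call per position, each conditioned on the growing prefix, while NAR decoding predicts every position independently, so all positions can in principle be filled in a single parallel call.

```python
# Toy contrast between autoregressive (AR) and non-autoregressive (NAR)
# decoding. toy_next_token is a hypothetical stand-in for a trained model.

def toy_next_token(prefix):
    # AR step: the next token may depend on everything generated so far.
    # Here it trivially depends only on the prefix length.
    return f"w{len(prefix)}"

def ar_decode(length):
    # AR decoding: `length` sequential model calls, each conditioned
    # on the previously generated tokens.
    out = []
    for _ in range(length):
        out.append(toy_next_token(out))
    return out

def nar_decode(length):
    # NAR decoding: each position is predicted independently of the
    # others, so all positions can be produced in one parallel step.
    return [f"w{i}" for i in range(length)]

if __name__ == "__main__":
    print(ar_decode(4))   # ['w0', 'w1', 'w2', 'w3']
    print(nar_decode(4))  # ['w0', 'w1', 'w2', 'w3']
```

In this toy setting both decoders happen to agree; the survey's point is that in real models the NAR independence assumption is what buys the speedup while costing accuracy, since positions can no longer condition on one another.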