Proceedings of the 13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics, 2023
DOI: 10.18653/v1/2023.ijcnlp-main.47
Only 5% Attention Is All You Need: Efficient Long-range Document-level Neural Machine Translation

Zihan Liu,
Zewei Sun,
Shanbo Cheng
et al.

Abstract: Document-level Neural Machine Translation (DocNMT) has been proven crucial for handling discourse phenomena by introducing document-level context information. One of the most important directions is to input the whole document directly to the standard Transformer model. In this case, efficiency becomes a critical concern due to the quadratic complexity of the attention module. Existing studies either focus on the encoder part, which cannot be deployed on sequence-to-sequence generation tasks, e.g., Machine Translation…
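The quadratic cost the abstract refers to comes from the n-by-n score matrix that full attention computes over an n-token document. A minimal illustrative sketch of the general idea (not the paper's exact method): keep only the top ~5% of score entries per query row before the softmax, so most of the matrix is masked out. The `sparse_attention` helper and `keep_ratio` parameter below are hypothetical names for illustration.

```python
import numpy as np

def sparse_attention(q, k, v, keep_ratio=0.05):
    """Single-head attention keeping only the top `keep_ratio` fraction
    of score entries per query row (illustrative sketch, assumed API)."""
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)          # (n, n): the quadratic term
    k_keep = max(1, int(keep_ratio * n))   # entries kept per query row
    # Threshold at each row's k_keep-th largest score; mask the rest.
    thresh = np.sort(scores, axis=-1)[:, -k_keep][:, None]
    masked = np.where(scores >= thresh, scores, -np.inf)
    # Numerically stable softmax over the surviving entries only.
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
n, d = 100, 16
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
out = sparse_attention(q, k, v)
print(out.shape)  # (100, 16)
```

With `keep_ratio=0.05` and `n=100`, each output token attends to only 5 keys instead of all 100; the dense score matrix is still materialized here, so this sketch only illustrates the sparsity pattern, not the memory savings an efficient implementation would achieve.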

Cited by 0 publications.
References 22 publications (47 reference statements).