2019
DOI: 10.5281/zenodo.3525024

Analysis of Positional Encodings for Neural Machine Translation

Cited by 4 publications (1 citation statement)
References 6 publications
“…RPE outperforms APE on out-of-distribution data in terms of sequence length owing to its innate shift invariance (Rosendahl et al., 2019; Neishi and Yoshinaga, 2019; Narang et al., 2021; Wang et al., 2021). However, the self-attention mechanism of RPE involves more computation than that of APE⁴.…”
Section: Relative Position Embedding (RPE)
confidence: 99%
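
The quoted statement contrasts relative position embeddings (RPE) with absolute position encodings (APE). Below is a minimal NumPy sketch of one self-attention head in both variants, loosely following a Shaw et al. (2018)-style relative bias; the dimensions, the clipping distance max_rel, and all variable names are assumptions made for illustration, not code from the cited paper.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_ape(x, pos, Wq, Wk):
    # Absolute position encoding: position vectors are added to the
    # inputs once, so the attention logits depend on absolute indices.
    h = x + pos[: x.shape[0]]
    q, k = h @ Wq, h @ Wk
    return softmax(q @ k.T / np.sqrt(q.shape[-1]))

def attention_rpe(x, Wq, Wk, rel_emb, max_rel):
    # Relative position embedding: a learned vector per clipped offset
    # (i - j) enters every query-key interaction. The extra per-pair
    # dot product q[i] . rel_emb[offset] is the added computation the
    # citation statement refers to.
    n = x.shape[0]
    q, k = x @ Wq, x @ Wk
    logits = q @ k.T
    for i in range(n):
        for j in range(n):
            offset = int(np.clip(i - j, -max_rel, max_rel)) + max_rel
            logits[i, j] += q[i] @ rel_emb[offset]
    return softmax(logits / np.sqrt(q.shape[-1]))

rng = np.random.default_rng(0)
n, d, max_rel = 6, 8, 4
x = rng.normal(size=(n, d))
Wq, Wk = rng.normal(size=(d, d)), rng.normal(size=(d, d))
pos = rng.normal(size=(n + 10, d))               # stand-in for a sinusoidal table
rel_emb = rng.normal(size=(2 * max_rel + 1, d))  # one vector per clipped offset
print(attention_ape(x, pos, Wq, Wk).shape)       # (6, 6)
print(attention_rpe(x, Wq, Wk, rel_emb, max_rel).shape)  # (6, 6)

Because the RPE bias depends only on the clipped offset i - j, shifting the whole sequence leaves the bias term unchanged, which is the shift invariance the statement credits for RPE's better length generalization; the price is one extra dot product for each of the n² query-key pairs compared with APE.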