2020
DOI: 10.48550/arxiv.2004.14788
Preprint

Character-Level Translation with Self-attention

Cited by 2 publications (1 citation statement)
References 11 publications
“…Chen et al [108] proposed an NMT model operating at different levels of granularity with multi-level attention. Gao et al [109] found that self-attention performs very well on character-level translation.…”
Section: Open Vocabulary (mentioning)
Confidence: 99%
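The cited finding concerns feeding raw character sequences, rather than subword units, into a self-attention encoder. The sketch below is a minimal, hypothetical PyTorch illustration (not the authors' code; the toy vocabulary, model sizes, and layer counts are assumptions) of what character-level input to a self-attention encoder looks like.

    # Minimal sketch: character-level tokens into a self-attention encoder.
    # Hypothetical example; hyperparameters and vocabulary are assumptions.
    import torch
    import torch.nn as nn

    # Build a toy character vocabulary; id 0 is reserved for padding.
    text = "hello world"
    chars = sorted(set(text))
    char_to_id = {c: i + 1 for i, c in enumerate(chars)}

    d_model = 64
    embed = nn.Embedding(len(char_to_id) + 1, d_model, padding_idx=0)
    encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
    encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

    # Each character becomes one token; no subword segmentation is applied.
    ids = torch.tensor([[char_to_id[c] for c in text]])  # shape: (1, 11)
    hidden = encoder(embed(ids))                          # shape: (1, 11, 64)
    print(hidden.shape)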