2017
DOI: 10.48550/arxiv.1710.06313
Preprint
Paying Attention to Multi-Word Expressions in Neural Machine Translation

Cited by 2 publications (2 citation statements)
References 0 publications
“…But since n-gramming does not provide context to phrases, this approach will be of limited help in logical reasoning (Bernardy and Chatzikyriakidis 2019; Zhou et al 2020). Instead, special techniques are needed for multiword expressions (Rikters and Bojar 2017) and also for proper names (e.g., transliteration from Latin to Cyrillic characters) (Petic and Gîfu 2014; Mansurov and Mansurov 2021).…”
Section: N-gramming: Computational Issues
confidence: 99%
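The quoted point, that n-gramming treats each window of words in isolation and so carries no wider phrase context, can be illustrated with a minimal n-gram extractor. This is a hypothetical sketch for illustration only, not code from the cited work:

```python
def ngrams(tokens, n):
    """Return all contiguous n-grams (as tuples) over a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

sentence = "special techniques are needed for multiword expressions".split()
bigrams = ngrams(sentence, 2)
# Each bigram is an isolated window: ('multiword', 'expressions') is
# extracted with no information about the rest of the sentence, which is
# why n-gram features alone give limited help with logical reasoning.
```
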
“…Therefore, there is no explicit mechanism to attend to full phrases rather than subwords or characters. Phrase-based NMT, which equips the model with the ability to attend to full phrases or multi-word expressions, has been studied by Rikters and Bojar [408], Ishiwatari et al [409], Feng et al [410], Huang et al [411], Li et al [412], Eriguchi et al [413].…”
Section: Advanced Attention Models
confidence: 99%