A Preordered RNN Layer Boosts Neural Machine Translation in Low Resource Settings
Preprint, 2021
DOI: 10.48550/arxiv.2112.13960

Abstract: Neural Machine Translation (NMT) models are strong enough to convey semantic and syntactic information from the source language to the target language. However, these models suffer from needing a large amount of data to learn their parameters, so for languages with scarce data they are at risk of underperforming. We propose to augment attention-based neural networks with reordering information to alleviate the lack of data. This augmentation improves the translation quality for both…
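The abstract describes augmenting an attention-based NMT model with reordering information via a preordered RNN layer, but the exact architecture is not given in this snippet. The sketch below is one plausible reading, not the paper's method: an extra RNN encodes the source in a preordered (target-like) word order, and its states are aligned back to the original positions and concatenated with the standard encoder states before attention. All names here (PreorderedEncoder, preorder, dimensions) are hypothetical.

```python
# A minimal sketch, assuming the preordered source is given as a
# per-sentence permutation of token positions. Hypothetical design.
import torch
import torch.nn as nn

class PreorderedEncoder(nn.Module):
    """Augments standard source encodings with an extra RNN pass over a
    preordered (target-word-order) copy of the source sentence."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Standard bidirectional encoder over the original word order.
        self.base_rnn = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        # Extra RNN layer over the preordered source sequence.
        self.preorder_rnn = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        # Project the concatenated states back to the attention dimension.
        self.proj = nn.Linear(4 * hid_dim, 2 * hid_dim)

    def forward(self, src, preorder):
        # src:      (batch, src_len) token ids in original order
        # preorder: (batch, src_len) permutation; the token at preordered
        #           position i comes from original position preorder[b, i]
        base_out, _ = self.base_rnn(self.embed(src))
        # Gather the tokens in preordered order, then encode them.
        pre_tokens = torch.gather(src, 1, preorder)
        pre_out, _ = self.preorder_rnn(self.embed(pre_tokens))
        # Scatter the preordered states back to original positions so the
        # two encodings align token-by-token before concatenation.
        aligned = torch.zeros_like(pre_out)
        aligned.scatter_(1, preorder.unsqueeze(-1).expand_as(pre_out), pre_out)
        return self.proj(torch.cat([base_out, aligned], dim=-1))

# Toy usage: a batch of 2 sentences of length 5 with random preorderings.
enc = PreorderedEncoder(vocab_size=100)
src = torch.randint(0, 100, (2, 5))
preorder = torch.stack([torch.randperm(5), torch.randperm(5)])
memory = enc(src, preorder)  # (2, 5, 512): fed to the attention decoder
print(memory.shape)
```

Concatenating the two encodings (rather than, say, replacing the original order) is just one way reordering information could be injected; the paper may use a different fusion.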

Cited by 0 publications
References 11 publications (13 reference statements)