2018
DOI: 10.1007/978-3-319-96133-0_26
A Hybrid Neural Machine Translation Technique for Translating Low Resource Languages

Cited by 17 publications (9 citation statements)
References 9 publications
“…Third, the current study focuses on human translation. Considering the recent success of neural machine translation (Almansor and Al-Ani, 2018; Islam et al., 2021), translation performance research would benefit from taking different task types (i.e., human translation and post-editing of neural machine translation) into account. Future studies could diversify the design of task features (e.g., task type) and select participants with different language pairs and diverse educational backgrounds, so as to further explore the relationships between task complexity, learner factors, and translation performance with larger samples.…”
Section: Discussion
confidence: 99%
“…Almansor and Al-Ani [70] presented a character-based hybrid NMT model that combines RNN and CNN networks. They trained their model on a very small portion of the TED parallel corpora containing only 90K sentence pairs, namely IWSLT 2016 Arabic-English.…”
Section: Low-resource
confidence: 99%
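The statement above describes a character-based encoder that combines convolutional and recurrent layers. As a rough illustration of that idea (not the authors' actual architecture; all layer sizes, names, and the single-layer structure here are assumptions), a minimal numpy sketch might embed characters, apply a 1-D convolution over character windows, and then run a simple RNN over the convolved features:

```python
import numpy as np

# Hypothetical sketch of a character-level hybrid CNN+RNN encoder, in the
# spirit of the model described above. Dimensions are illustrative only.

rng = np.random.default_rng(0)

VOCAB = 32      # character vocabulary size (assumed)
EMB = 16        # character embedding size (assumed)
KERNEL = 3      # convolution window over characters (assumed)
HID = 24        # RNN hidden size (assumed)

emb = rng.normal(0, 0.1, (VOCAB, EMB))
conv_w = rng.normal(0, 0.1, (KERNEL * EMB, EMB))   # 1-D conv as a matmul
W_xh = rng.normal(0, 0.1, (EMB, HID))
W_hh = rng.normal(0, 0.1, (HID, HID))

def encode(char_ids):
    """Embed characters, apply a 1-D convolution, then a simple RNN."""
    x = emb[char_ids]                                # (T, EMB)
    # Pad so the convolution preserves sequence length.
    pad = KERNEL // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    # Convolution: each position sees a KERNEL-wide character window.
    windows = np.stack([xp[t:t + KERNEL].ravel() for t in range(len(x))])
    c = np.tanh(windows @ conv_w)                    # (T, EMB)
    # Simple (Elman-style) RNN over the convolved features.
    h = np.zeros(HID)
    for t in range(len(c)):
        h = np.tanh(c[t] @ W_xh + h @ W_hh)
    return h                                         # fixed-size sentence code

vec = encode(np.array([1, 5, 9, 2, 7]))
print(vec.shape)  # (24,)
```

Operating at the character level is one common way to cope with low-resource settings such as the 90K-pair IWSLT 2016 Arabic-English corpus mentioned above, since it shrinks the vocabulary and sidesteps rare-word sparsity.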
“…Our second finding demonstrates the capability of MMPLMs to generate a new language-pair knowledge space for translating clinical-domain text, even though this language pair was unseen in the pre-training stage under our experimental settings. This can be useful for low-resource NLP, such as the work by (26, 27).…”
Section: Introduction
confidence: 99%