2020
DOI: 10.1109/tnnls.2019.2957276
Neural Machine Translation With GRU-Gated Attention Model

Cited by 93 publications (43 citation statements)
References 23 publications
“…On the one hand, the similarity of English words can help AI quickly understand a class of similar text information; on the other hand, if there is a large error in the recognition of English word similarity, it will also affect the AI's recognition of English words and text [3]. This task is also widely used in machine translation [4]. In addition, non-native language learners also need to understand and learn similar English words when learning English, which requires the help of an English dictionary [5].…”
Section: Introduction
confidence: 99%
“…We employed the encoder structure of the seq2seq model [24] here as the instance feature extractor. The embedding layer [25] was used to represent bases, mapping the 15 symbols (A, T, G, C, N, H, B, D, V, R, M, S, W, Y, K) to a 4-dimensional representation. The encoder used a bi-directional RNN structure, which gives equal attention to the head and the tail of the instance, and its output was a context vector [26] representing the feature of the instance. Subsequently, through the MIL layer, the features of all instances were scored and aggregated jointly to determine the type of the bag [20, 21, 27].…”
Section: Methods
confidence: 99%
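The encoder pipeline described in that excerpt — embed each of the 15 base symbols into 4 dimensions, run a bi-directional recurrent pass, and concatenate the two final states into a context vector — can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the GRU parameterization, sizes, and names (`BASES`, `encode`, `HID_DIM`) are all assumptions for the sketch, and the MIL scoring/aggregation layer is omitted.

```python
import numpy as np

BASES = "ATGCNHBDVRMSWYK"   # the 15 base symbols from the excerpt
EMB_DIM, HID_DIM = 4, 8     # 4-dim embedding as described; hidden size is illustrative

rng = np.random.default_rng(0)
embedding = rng.normal(scale=0.1, size=(len(BASES), EMB_DIM))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def make_gru_params():
    # One weight matrix per gate (update z, reset r, candidate h),
    # each acting on the concatenation [h; x].
    return {g: rng.normal(scale=0.1, size=(HID_DIM, HID_DIM + EMB_DIM))
            for g in ("z", "r", "h")}

def gru_step(params, h, x):
    hx = np.concatenate([h, x])
    z = sigmoid(params["z"] @ hx)                           # update gate
    r = sigmoid(params["r"] @ hx)                           # reset gate
    h_tilde = np.tanh(params["h"] @ np.concatenate([r * h, x]))
    return (1.0 - z) * h + z * h_tilde

fwd_params, bwd_params = make_gru_params(), make_gru_params()

def encode(seq):
    """Embed a base sequence and return a bi-directional context vector."""
    xs = [embedding[BASES.index(b)] for b in seq]
    h_f = np.zeros(HID_DIM)
    for x in xs:                     # forward pass: head to tail
        h_f = gru_step(fwd_params, h_f, x)
    h_b = np.zeros(HID_DIM)
    for x in reversed(xs):           # backward pass: tail to head
        h_b = gru_step(bwd_params, h_b, x)
    # Context vector: concatenation of the two final hidden states,
    # so head and tail of the instance get equal weight.
    return np.concatenate([h_f, h_b])

context = encode("ATGCGTA")
print(context.shape)  # (16,)
```

Concatenating the forward and backward final states is one common way to form the context vector; summing or averaging the per-step states are equally valid variants of the same idea.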
“…With the development of neural networks, the birth of neural machine translation technology has opened up a new path in the field of machine translation [4]. The emergence of neural machine translation technology has solved many of the abovementioned shortcomings of statistical machine translation.…”
Section: Related Work
confidence: 99%