2021
DOI: 10.11591/ijece.v11i3.pp2315-2326

A systematic review on sequence-to-sequence learning with neural network and its models

Abstract: We develop a precise writing survey on sequence-to-sequence learning with neural network and its models. The primary aim of this report is to enhance the knowledge of the sequence-to-sequence neural network and to locate the best way to deal with executing it. Three models are mostly used in sequence-to-sequence neural network applications, namely: recurrent neural networks (RNN), connectionist temporal classification (CTC), and attention model. The evidence we adopted in conducting this survey included utiliz…
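The abstract names the attention model as one of the three approaches commonly used in sequence-to-sequence networks. As a rough illustration only (not code from the surveyed paper), the core attention step can be sketched in NumPy: score each encoder hidden state against the current decoder state, normalize the scores with a softmax, and take the weighted sum as the context vector. All names and dimensions here are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(decoder_state, encoder_states):
    # dot-product attention: score each encoder state against the decoder state
    scores = encoder_states @ decoder_state      # shape (T,)
    weights = softmax(scores)                    # attention distribution over source positions
    context = weights @ encoder_states           # weighted sum, shape (d,)
    return context, weights

# toy example: 4 source positions, hidden size 3 (values are arbitrary)
enc = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
dec = np.array([1.0, 0.0, 0.0])
ctx, w = attention_context(dec, enc)
```

Here positions 0 and 3 align best with the decoder state (largest dot products), so they receive the largest attention weights, and the context vector leans toward their hidden states.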


Cited by 25 publications (22 citation statements)
References 33 publications
“…It is an evidence-based minimum set of elements for systematic review reports that are intended to assist systematic reviewers in clearly explaining why the review was conducted and what the authors performed. It has previously been used to target comparable research objectives [21], [22].…”
Section: Methods (mentioning, confidence: 99%)
“…References Not consider other factors (21) [25], [26], [30], [31], [32], [38], [43], [44], [45], [46], [47], [48], [49], [50], [51], [52], [53], [54], [55], [56], [57] Convenience sampling (19) [32], [35], [37], [40], [43], [45], [46], [50], [52], [53], [55], [57], [58], [59], [60], [61], [62], [63],…”
Section: Limitations (mentioning, confidence: 99%)
“…Additionally, it is worth noting that the purpose of this work is not to propose a new sequence-to-sequence learning model, but to use a sequence-to-sequence learning model to assist CARP solving and to verify that the use of sequence-to-sequence learning can help speed up the solving. There are many well-established models that are available to address the resultant task [32]-[34] and it is believed that the performance may be further improved with more advanced methods.…”
Section: B. The Proposed Solver (mentioning, confidence: 99%)
“…Recurrent Neural Networks) [15], [59], [72]-[74], and static (e.g., Convolutional Neural Network) models. It was first introduced in sequence-to-sequence deep learning models, and widely applied for machine translation task, speech recognition, and time series analysis [75].…”
Section: Encoder-decoder Deep Learning Architectures (mentioning, confidence: 99%)