2020
DOI: 10.1007/978-981-15-2449-3_17
Deep Recurrent Neural Network (Deep-RNN) for Classification of Nonlinear Data

Cited by 10 publications (6 citation statements)
References 22 publications
“…Given the wide range of problems it can cover, the neural network architectural approach has also evolved for specific problem domains. Classification and pattern recognition on data series is one of the approaches supported by neural networks [10], [11]. Applying a recurrent neural network (RNN) is a suitable approach for data-series problems, considering the architecture's ability not only to treat each data point as a separate element but also to examine the relationships between data points [12].…”
Section: B. Machine Learning Neural Network (unclassified)
“…In the RNN, not only the final output but also the output at each time step is used to compute an error, and the error at each time step is back-propagated through the network. This technique is called "Backpropagation Through Time" or BPTT [42], [43], [44], [45]. Furthermore, the RNN faces unique challenges with time series and sequential data.…”
Section: B. Related Work and Topics (mentioning)
confidence: 99%
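The BPTT mechanism described in this statement can be illustrated with a minimal PyTorch sketch; the `TinyRNN` module, dimensions, and random data are illustrative placeholders, not details from the cited paper. A loss is computed from the output at every time step, the per-step losses are summed, and a single backward pass carries each step's error back through the unrolled network.

```python
# Minimal sketch of Backpropagation Through Time (BPTT), under the
# assumptions stated above (toy module, toy data).
import torch
import torch.nn as nn

class TinyRNN(nn.Module):
    """Toy per-time-step classifier; names and sizes are illustrative."""
    def __init__(self, in_dim=8, hid_dim=16, out_dim=4):
        super().__init__()
        self.cell = nn.RNNCell(in_dim, hid_dim)
        self.head = nn.Linear(hid_dim, out_dim)

    def forward(self, x):                       # x: (seq_len, batch, in_dim)
        h = torch.zeros(x.size(1), self.cell.hidden_size)
        step_logits = []
        for t in range(x.size(0)):              # unroll the network over time
            h = self.cell(x[t], h)
            step_logits.append(self.head(h))    # an output at every step, not only the last
        return torch.stack(step_logits)         # (seq_len, batch, out_dim)

model = TinyRNN()
criterion = nn.CrossEntropyLoss()
x = torch.randn(5, 2, 8)                        # 5 time steps, batch of 2
targets = torch.randint(0, 4, (5, 2))           # one target class per time step

logits = model(x)
loss = sum(criterion(logits[t], targets[t]) for t in range(x.size(0)))
loss.backward()                                 # each step's error flows back through all earlier steps
```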
“…In contrast, in a usual RNN, the error at each time step is computed and back-propagated through the network. This type of backpropagation in an RNN is called Backpropagation Through Time (BPTT), as explained in the introduction section [42], [43], [44]. Because our RNN works over iterations rather than time, we call this framework Backpropagation Through Iteration (BPTI) to avoid confusion in this paper.…”
Section: Backpropagation Through Iteration (BPTI) (mentioning)
confidence: 99%
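A hedged sketch of how unrolling over iterations rather than time steps might look, based only on the quoted description: the same cell is applied repeatedly to one static input and an error is collected at every iteration. This is an illustrative reading of BPTI, not the cited authors' implementation.

```python
# Illustrative Backpropagation Through Iteration (BPTI)-style loop,
# assuming a static input refined by repeated applications of one cell.
import torch
import torch.nn as nn

cell = nn.GRUCell(input_size=8, hidden_size=16)   # recurrent cell reused at every iteration
head = nn.Linear(16, 4)
criterion = nn.CrossEntropyLoss()

x = torch.randn(2, 8)                             # one static input (batch of 2), no time axis
target = torch.randint(0, 4, (2,))                # a single target per example
h = torch.zeros(2, 16)

loss = torch.zeros(())
for _ in range(6):                                # unroll over iterations instead of time steps
    h = cell(x, h)                                # the same input x is fed at every iteration
    loss = loss + criterion(head(h), target)      # an error is collected at every iteration
loss.backward()                                   # gradients flow back through all iterations
```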
“…Considering the correlation between words in the text, the authors in [20] proposed the latent Dirichlet allocation (LDA) model, which added Dirichlet prior distributions to the multinomial distributions of texts, topics, and words. Sentence-based approaches, such as convolutional neural networks (CNNs) [21], [22] and recurrent neural networks (RNNs) [23], [24], presented the numerical representation of text at the sentence level, which was effectively adopted in natural language processing (NLP) [25], [26]. However, the CNN and RNN input was based on word vectors generated by word-embedding methods.…”
Section: A. Text Mining (mentioning)
confidence: 99%
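The last point in this statement, that CNN/RNN input is built from word vectors produced by word embeddings, can be sketched as follows; the module name, vocabulary size, and dimensions are placeholders, not details from the cited works.

```python
# Minimal sketch of the described pipeline: token ids -> word-embedding
# vectors -> RNN (GRU) -> sentence-level class logits.
import torch
import torch.nn as nn

class TextRNNClassifier(nn.Module):
    """Illustrative sentence classifier: an embedding lookup feeding a GRU."""
    def __init__(self, vocab_size=10000, emb_dim=100, hid_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)      # word-embedding lookup table
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.head = nn.Linear(hid_dim, num_classes)

    def forward(self, token_ids):                 # token_ids: (batch, seq_len) integer ids
        word_vectors = self.embed(token_ids)      # (batch, seq_len, emb_dim) word vectors
        _, h_last = self.rnn(word_vectors)        # final hidden state summarizes the sentence
        return self.head(h_last.squeeze(0))       # sentence-level class logits

model = TextRNNClassifier()
token_ids = torch.randint(0, 10000, (4, 20))      # 4 sentences, 20 token ids each
logits = model(token_ids)                         # shape (4, 2)
```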