2018
DOI: 10.3390/app8040630
Reconstruct Recurrent Neural Networks via Flexible Sub-Models for Time Series Classification

Abstract: Training recurrent neural networks (RNNs) remains challenging, and they still lack long-term memory and learning ability in sequential-data classification and prediction. In this paper, we propose a flexible recurrent model, BIdirectional COnvolutional RaNdom RNNs (BICORN-RNNs), incorporating a series of sub-models: random projection, convolutional operation, and bidirectional transmission. These sub-models advance classification accuracy, which was previously limited by the vanishing and exploding gradient problems.…
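The abstract names three sub-models but not how they compose. A minimal sketch in plain Python of one plausible composition — per-timestep random projection, temporal convolution, then forward and backward recurrent passes whose states are concatenated. All dimensions, weight initializations, and the names `random_projection`, `conv1d`, and `bicorn_forward` are illustrative assumptions, not the paper's actual implementation:

```python
import math
import random

random.seed(0)

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

def random_projection(seq, out_dim):
    """Project every timestep with one fixed Gaussian random matrix."""
    d = len(seq[0])
    R = [[random.gauss(0.0, 1.0 / math.sqrt(out_dim)) for _ in range(d)]
         for _ in range(out_dim)]
    return [matvec(R, x) for x in seq]

def conv1d(seq, kernel=(0.25, 0.5, 0.25)):
    """Channel-wise temporal convolution with 'same' zero padding."""
    T, d, pad = len(seq), len(seq[0]), len(kernel) // 2
    out = []
    for t in range(T):
        row = []
        for c in range(d):
            s = 0.0
            for j, w in enumerate(kernel):
                tt = t + j - pad
                if 0 <= tt < T:
                    s += w * seq[tt][c]
            row.append(s)
        out.append(row)
    return out

def rnn(seq, W_in, W_h):
    """Vanilla tanh RNN; returns the hidden state at every timestep."""
    h, states = [0.0] * len(W_h), []
    for x in seq:
        pre = [a + b for a, b in zip(matvec(W_in, x), matvec(W_h, h))]
        h = [math.tanh(p) for p in pre]
        states.append(h)
    return states

def bicorn_forward(seq, hidden=2):
    proj = random_projection(seq, out_dim=2)
    conv = conv1d(proj)
    d = len(conv[0])
    W_in = [[random.gauss(0, 0.5) for _ in range(d)] for _ in range(hidden)]
    W_h = [[random.gauss(0, 0.5) for _ in range(hidden)] for _ in range(hidden)]
    fwd = rnn(conv, W_in, W_h)
    bwd = rnn(conv[::-1], W_in, W_h)[::-1]
    # Concatenate forward and backward states at each timestep.
    return [f + b for f, b in zip(fwd, bwd)]

seq = [[math.sin(0.3 * t + c) for c in range(4)] for t in range(10)]
out = bicorn_forward(seq)
print(len(out), len(out[0]))  # 10 timesteps, 2*hidden features each
```

The bidirectional concatenation is what gives each timestep's representation access to both past and future context, which a plain forward RNN lacks.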

Cited by 5 publications (4 citation statements)
References 35 publications
“…A recurrent neural network (RNN) is mainly used to learn ordered or time-series data, such as in natural language processing and speech recognition [50][51][52][53][54][55][56][57][58][59]. However, RNNs suffer from the vanishing gradient problem, which significantly reduces learning ability when the point where information is used lies far from the output that produced it [60,61].…”
Section: Long Short-Term Memory (LSTM) (mentioning)
confidence: 99%
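The vanishing-gradient effect this citation describes can be made concrete with a minimal scalar RNN (a generic illustration, not the cited paper's model). For h_t = tanh(w·h_{t-1} + x_t), the chain-rule gradient of h_T with respect to h_0 is a product of per-step factors w·(1 − h_t²), which shrinks geometrically when |w| < 1 and shrinks further as tanh saturates:

```python
import math

# Scalar RNN: h_t = tanh(w * h_{t-1} + x_t).  Track the gradient of the
# current state with respect to the initial state h_0 as the product of
# the per-step chain-rule factors w * (1 - h_t**2).
w = 0.9
h, grad = 0.0, 1.0
grads = []
for t in range(50):
    h = math.tanh(w * h + 0.1)     # constant input, for illustration only
    grad *= w * (1.0 - h * h)      # chain-rule factor for this step
    grads.append(grad)

print(grads[4], grads[49])
```

After 50 steps the gradient is many orders of magnitude smaller than after 5, so distant inputs contribute almost nothing to the parameter update — exactly the long-range learning failure that motivates LSTM-style gating.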
“…The simulation uses spatial point-target data of four shape types: cone, cone-cylinder, ball-base cone, and curved pieces. The shape, physical properties, micro-motion parameters, and sensor parameters of the various spatial targets are shown in Table 1 [4,13,22]. To account for thermal noise, the non-uniformity of the infrared sensor, and similar factors, additive white Gaussian noise is used to model the resulting data deviation in the infrared radiation simulation, improving the realism of the data [22].…”
Section: Infrared Radiation Sequence Simulation (mentioning)
confidence: 99%
“…RNNs are an improved structure of feed-forward ANNs, with time-recurrent connections and the ability to memorize previous information. Moreover, the RNN algorithm has a simple structure, high computational efficiency, and low computation and storage requirements [13]. However, RNNs essentially focus only on local information, especially when trained with error back-propagation, which inevitably limits their grasp of the overall sequence and hinders their ability to learn complex decision functions [13,14].…”
Section: Introduction (mentioning)
confidence: 99%
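The "local information" limitation this citation raises — that a forward RNN state can never depend on future inputs — is easy to demonstrate with a toy scalar RNN (weights and inputs are arbitrary illustrative values): perturbing a later input leaves all earlier states bit-for-bit unchanged.

```python
import math

def rnn_states(xs, w_in=0.7, w_h=0.8):
    """Forward scalar RNN; returns the hidden state at every timestep."""
    h, out = 0.0, []
    for x in xs:
        h = math.tanh(w_in * x + w_h * h)
        out.append(h)
    return out

xs = [0.5, -0.2, 0.1, 0.9, -0.4]
base = rnn_states(xs)
perturbed = rnn_states(xs[:3] + [5.0, -0.4])   # change only x_4

# States up to t=3 are identical: a forward RNN never sees the future.
print(base[:3] == perturbed[:3], base[3] == perturbed[3])
```

A bidirectional model, such as the BICORN-RNNs proposed in this paper, addresses this by also running a backward pass, so every timestep's representation reflects the whole sequence.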