2020 IEEE 3rd International Conference on Automation, Electronics and Electrical Engineering (AUTEEE)
DOI: 10.1109/auteee50969.2020.9315723
Traffic Flow Prediction Based on Stack AutoEncoder and Long Short-Term Memory Network

Cited by 6 publications (8 citation statements)
References 11 publications
“…The reduction in the number of neurons in each layer starts from the input layer and is followed by mirroring of the layers at the center, known as the bottleneck, to build the autoencoder’s decoding section. An AE aims to compress the input into a lower-dimensional code and then reconstruct the output from that representation to recover the original picture [8, 30, 31]. Like the multilayer perceptron, the AE framework consists of a neural network with one or two hidden layers.…”
Section: Methods
confidence: 99%
“…Compressing the input vectors into lower dimensions increases the learning efficiency [7]. The AE’s input and output layers should contain the same number of neurons [8, 9] because the AE aims to initialize the latent-layer parameters that will reconstruct the multidimensional input data [31].…”
Section: Methods
confidence: 99%
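The mirrored encoder–decoder layout these excerpts describe — widths shrinking to a central bottleneck and then expanding back so input and output layers match — can be sketched as a minimal forward pass. The layer widths (64 → 32 → 16 → 32 → 64) and the random, untrained weights below are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative mirrored layer widths: the decoder reverses the encoder,
# and the 16-unit bottleneck holds the compressed code.
widths = [64, 32, 16, 32, 64]

# Random weights and zero biases for each dense layer (untrained sketch).
params = [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
          for m, n in zip(widths[:-1], widths[1:])]

def forward(x):
    """Pass x through encoder and decoder; return (code, reconstruction)."""
    h, code = x, None
    for i, (W, b) in enumerate(params):
        h = np.tanh(h @ W + b)
        if i == len(widths) // 2 - 1:   # layer that produces the bottleneck
            code = h
    return code, h

x = rng.standard_normal((5, 64))        # 5 samples, 64 input features
code, recon = forward(x)
print(code.shape, recon.shape)          # 16-dim code; output width mirrors input
```

Note that the output layer has exactly as many units as the input layer, matching the excerpts' point that the AE's input and output layers contain the same number of neurons.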
“…An LSTM, equipped with memory cells and gating functions, models long-term dependencies and resolves vanishing gradients, offering significant advantages in time-series prediction [10]. It has been widely applied in fields such as stock prediction in financial markets and short-term traffic flow prediction [11–14]. Nevertheless, standard LSTM neural networks encounter problems in time-series prediction, such as high time consumption and complexity [15].…”
Section: Related Work
confidence: 99%
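The memory cells and gating functions mentioned in this excerpt can be illustrated with a single LSTM step in plain NumPy. The input and hidden sizes and the random weights are arbitrary assumptions for the sketch; only the gate equations reflect the standard LSTM formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 8, 16                      # illustrative sizes

# One combined weight matrix for the four gates (input, forget, cell, output).
W = rng.standard_normal((n_in + n_hid, 4 * n_hid)) * 0.1
b = np.zeros(4 * n_hid)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c):
    """One LSTM time step: gated update of cell state c and hidden state h."""
    z = np.concatenate([x, h]) @ W + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # gates squashed into (0, 1)
    g = np.tanh(g)                                # candidate cell update
    c = f * c + i * g       # additive memory path that eases vanishing gradients
    h = o * np.tanh(c)      # expose a gated view of the memory
    return h, c

h = c = np.zeros(n_hid)
for t in range(10):                      # roll the cell over a 10-step sequence
    h, c = lstm_step(rng.standard_normal(n_in), h, c)
print(h.shape, c.shape)
```

The additive update `c = f * c + i * g` is the mechanism the excerpt alludes to: because the cell state is carried forward by elementwise gating rather than repeated matrix multiplication, gradients decay far more slowly over long sequences.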
“…In [181], the authors used LSTM to build a short-term traffic flow prediction model based on the driving data of private cars and minibuses. In [182], the LSTM is used not only to extract time-series information but is also combined with a stacked autoencoder to extract spatial information from traffic data, thereby achieving more accurate traffic flow prediction.…”
Section: Intelligent Transportation Systems (ITS)
confidence: 99%
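The combination this excerpt attributes to the paper — a stacked autoencoder compressing each time step's spatial sensor readings, with an LSTM modeling their temporal evolution — can be sketched as one pipeline. All dimensions, the random (untrained) weights, and the scalar readout are hypothetical choices for illustration, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative dimensions: 20 road-sensor readings per time step,
# compressed to an 8-dim spatial code, 16 LSTM hidden units, 12 steps.
n_sensors, n_code, n_hid, T = 20, 8, 16, 12

# Encoder of a (here random, in practice pre-trained) stacked autoencoder.
W_enc = rng.standard_normal((n_sensors, n_code)) * 0.1

# LSTM weights for the four gates, plus a linear readout to one flow value.
W_lstm = rng.standard_normal((n_code + n_hid, 4 * n_hid)) * 0.1
w_out = rng.standard_normal(n_hid) * 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_flow(seq):
    """Encode each time step spatially, roll an LSTM over the codes,
    and read a scalar traffic-flow prediction from the last hidden state."""
    h = c = np.zeros(n_hid)
    for x in seq:
        code = np.tanh(x @ W_enc)                 # spatial compression (SAE encoder)
        z = np.concatenate([code, h]) @ W_lstm
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    return float(h @ w_out)

seq = rng.standard_normal((T, n_sensors))         # T time steps of sensor data
y_hat = predict_flow(seq)
print(np.isfinite(y_hat))
```

The design point is the division of labor: the autoencoder handles correlations *across* sensors at one instant, so the LSTM only has to track the dynamics of a compact code rather than the raw high-dimensional readings.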