2016 31st Youth Academic Annual Conference of Chinese Association of Automation (YAC)
DOI: 10.1109/yac.2016.7804912
Using LSTM and GRU neural network methods for traffic flow prediction

Cited by 1,002 publications (487 citation statements)
References 8 publications
“…Model accuracy is the opposite measure of the MAPE values. Equation 16 represents the formula of accuracy of the model. For each ATR, we compare 21 different models based on the type of missing data treatment method and the variety of transfer functions used in the repeater block of the RNN model.…”
Section: Analysis and Results (mentioning)
confidence: 99%
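
The statement above treats accuracy as the complement of MAPE. A plausible reconstruction, assuming the standard MAPE definition (the cited paper's Equation 16 itself is not reproduced in the excerpt):

\mathrm{MAPE} = \frac{100\%}{n} \sum_{t=1}^{n} \left| \frac{y_t - \hat{y}_t}{y_t} \right|, \qquad \mathrm{Accuracy} = 100\% - \mathrm{MAPE}

where y_t is the observed traffic volume in interval t and \hat{y}_t is the model's prediction for that interval.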
“…Notably, an influence function was designed to help recognize congestion sources. Two RNN models, namely, LSTM and gated recurrent units, were used by Fu, Zhang, and Li (2016) to predict short‐term traffic flow. It should be noted that the data of these online hailing vehicles contain both spatial and temporal features that are often neglected in traffic state prediction, leading to a waste of valuable information.…”
Section: Literature Review (mentioning)
confidence: 99%
“…In [17] the authors proposed the use of LSTM and GRU networks for forecasting of traffic flow time series, comparing the results achieved with an ARIMA model. The LSTM and GRU results are better than ARIMA for this type of time series.…”
Section: Long Short-Term Memory (LSTM) (mentioning)
confidence: 99%
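
To illustrate the comparison described in the statement above, the following is a minimal sketch of the kind of recurrent predictor (GRU, interchangeable with LSTM) that such studies benchmark against ARIMA. The framework (Keras), look-back window, layer width, and training settings are illustrative assumptions, not the configuration used in the cited works.

import numpy as np
import tensorflow as tf

WINDOW = 12  # assumed look-back window, e.g. twelve 5-minute flow intervals

def make_windows(series, window=WINDOW):
    """Slice a 1-D traffic-flow series into (input window, next value) pairs."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)

# Single-layer GRU regressor; replacing GRU with LSTM gives the LSTM variant.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.GRU(64),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mape"])

# Example usage with synthetic data standing in for a real flow series.
flow = np.sin(np.linspace(0, 50, 2000)) + np.random.normal(0, 0.1, 2000)
X, y = make_windows(flow)
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

The same windowed series could also be fed to an ARIMA baseline (for example statsmodels' ARIMA class) to reproduce the kind of MAPE comparison the citation refers to.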