2022
DOI: 10.1038/s41598-022-06975-1

Combination predicting model of traffic congestion index in weekdays based on LightGBM-GRU

Abstract: Tree-based and deep learning methods can automatically generate useful features. Not only can they enhance the original feature representation, they can also learn to generate new features. This paper develops a strategy based on the Light Gradient Boosting Machine (LightGBM or LGB) and the Gated Recurrent Unit (GRU) to generate features that improve the expressive ability of limited features. Moreover, a SARIMA-GRU prediction model that accounts for weekly periodicity is introduced. First, LightGBM is used to learn featu…
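
A minimal sketch of the strategy the abstract outlines, under assumptions of my own: the "generated features" are taken to be LightGBM leaf indices, and the data, window length, and hyperparameters are invented for illustration rather than drawn from the paper. LightGBM is fit on the limited original features, the leaf index from every tree is appended to those features, and a GRU reads short windows of the enriched sequence to forecast the congestion index.

# Sketch only: leaf indices as generated features, synthetic data, toy hyperparameters.
import numpy as np
import lightgbm as lgb
import torch
import torch.nn as nn

# Toy congestion-index series: X holds the limited original features per time step,
# y is the congestion index to predict (both synthetic).
T, n_feat = 2000, 6
rng = np.random.default_rng(0)
X = rng.random((T, n_feat)).astype(np.float32)
y = (X @ rng.random(n_feat)).astype(np.float32)

# 1) LightGBM learns the target; the leaf index each sample falls into in every tree
#    (pred_leaf=True) serves as an automatically generated feature.
gbm = lgb.LGBMRegressor(n_estimators=32, num_leaves=15)
gbm.fit(X, y)
leaf_idx = gbm.predict(X, pred_leaf=True).astype(np.float32)   # shape (T, 32)

# 2) Enhance the original representation by concatenating raw and generated features.
X_enriched = np.concatenate([X, leaf_idx], axis=1)

# 3) A GRU consumes short windows of the enriched features and predicts the next value.
window = 12
seqs = np.stack([X_enriched[i:i + window] for i in range(T - window)])
targets = y[window:]

class GRUForecaster(nn.Module):
    def __init__(self, in_dim, hidden=64):
        super().__init__()
        self.gru = nn.GRU(in_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.gru(x)            # out: (batch, window, hidden)
        return self.head(out[:, -1])    # predict from the last time step

model = GRUForecaster(X_enriched.shape[1])
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

xb = torch.from_numpy(seqs)
yb = torch.from_numpy(targets).unsqueeze(1)
for _ in range(5):                      # a few illustrative training steps
    opt.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    opt.step()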

Cited by 12 publications (8 citation statements)
References 23 publications (12 reference statements)
“…[53] stacked LSTM with transfer learning but does not consider heterogeneous data sources; [54] a hybrid CNN-LSTM that omits other exogenous data sources; [55] Vehicular Ad hoc Networks, without discussing probabilistic traffic prediction techniques or hybrid classifiers; [56] bagging, boosting, stacking, and random forest ensemble models, with no attention to deep learning or missing-data imputation methods; [61] a Deep Belief Network (DBN) that works only for ETA and not on heterogeneous data sources; [62] LightGBM-GRU for enhancing the feature list, while heterogeneous data sources are ignored; [63] a CNN that does not address hybrid deep learning models or exogenous data sources; [64] Support Vector Machine (SVM) and Multinomial Naïve Bayes (MNB), again without hybrid deep learning models or exogenous data sources; [65] LSTM + CNN + attention, which does not work on heterogeneous data sources; and the limitation of [66] is that the authors did not take into consideration the impact of heterogeneous data sources on predicting traffic congestion on a road network. The authors in [37] introduced a path-based deep learning framework for capturing spatial-temporal features as well as effective speed prediction.…”
Section: Methods
mentioning confidence: 99%
“…These strategies, along with others, contribute to LightGBM's superior computational efficiency and accuracy compared to other algorithms. Cheng W et al [30] introduced the use of LightGBM in combination with a gated recurrent unit to predict weekday traffic congestion. The objective of their study was to build a model that could effectively capture and express features that are limited under traditional approaches.…”
Section: Boosting Ensemble Learning
mentioning confidence: 99%
“…Reference 27 proposed LGBM and a gated recurrent unit for predicting weekday traffic congestion. The study's objective was to create a model that improves the expressive ability of limited features by generating new features with the proposed model.…”
Section: Literature Review
mentioning confidence: 99%