2021
DOI: 10.1016/j.ijepes.2021.106830
Short-term load forecasting of industrial customers based on SVMD and XGBoost

Cited by 120 publications (44 citation statements)
References 33 publications
“…It is an ensemble model based on decision trees that combines multiple weak learners into a strong learner through iterative learning. It works by boosting numerous weak learners, such as regression trees, and assembling them into a single, stronger learner [36]. The basic principle is to learn sequentially at each iteration: the current regression tree is fitted to the residual of the previous trees.…”
Section: B: XGBoost Regression Model and Hyperparameter Optimization
confidence: 99%
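The residual-fitting principle described above can be sketched in a few lines of plain Python. This is an illustrative toy, not the paper's XGBoost implementation: each round fits a depth-1 regression tree (a stump) to the residuals of the ensemble built so far, so the current tree corrects the errors of the previous trees.

```python
# Toy gradient-boosting sketch (illustrative, not the cited paper's XGBoost):
# each round fits a regression stump to the residuals of the current ensemble.

def fit_stump(xs, residuals):
    """Return (threshold, left_mean, right_mean) minimizing squared error."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lm = sum(left) / len(left) if left else 0.0
        rm = sum(right) / len(right) if right else 0.0
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    return best[1:]

def boost(xs, ys, rounds=20, lr=0.5):
    """Additive model: each stump is fitted to the previous trees' residuals."""
    pred = [0.0] * len(xs)
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        t, lm, rm = fit_stump(xs, residuals)
        pred = [p + lr * (lm if x <= t else rm) for x, p in zip(xs, pred)]
    return pred

xs = [1, 2, 3, 4, 5, 6]
ys = [1.0, 1.2, 0.9, 3.1, 3.0, 3.2]
pred = boost(xs, ys)
mse = sum((y - p) ** 2 for y, p in zip(ys, pred)) / len(ys)
```

After a handful of rounds the residuals shrink toward zero, which is the sequential-learning behavior the citation statement describes.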
“…where ŷ_i denotes the predicted value of the i-th sample, M denotes the number of CARTs (regression trees) in the model, f_m(x_i) represents the prediction of the m-th tree for the i-th sample, and F is the function space of CARTs. The objective function of XGBoost comprises the MSE loss function and the regularisation term represented by (5) [36].…”
Section: B: XGBoost Regression Model and Hyperparameter Optimization
confidence: 99%
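For reference, the standard XGBoost formulation that this citation paraphrases (the symbols match the snippet; the exact regularisation constants are the usual ones from the XGBoost literature, not quoted from the cited paper) is:

```latex
\hat{y}_i = \sum_{m=1}^{M} f_m(x_i), \quad f_m \in \mathcal{F}
\qquad
\mathrm{Obj} = \sum_{i} l\!\left(y_i, \hat{y}_i\right) + \sum_{m=1}^{M} \Omega(f_m),
\qquad
\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \lVert w \rVert^2
```

Here $l$ is the (MSE) loss, $T$ the number of leaves, $w$ the leaf weights, and $\gamma,\lambda$ regularisation coefficients.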
“…Similar to Random Forest, XGBoost models are based on ensemble learning, combining gradient-boosted decision trees with a second-order Taylor expansion of the loss function to speed up optimisation while avoiding overfitting [64]. XGBoost also supports parallel processing and is therefore faster to train and deploy than traditional decision trees.…”
Section: Machine Learning Models
confidence: 99%
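The second-order Taylor trick mentioned above has a simple concrete consequence worth sketching (assumed squared-error loss for illustration, not necessarily the cited study's setup): expanding the loss as l ≈ l₀ + g·w + ½h·w² gives a closed-form optimal leaf weight w* = −Σg / (Σh + λ), so each tree's leaf values can be computed analytically from first and second derivatives instead of by line search.

```python
# Second-order (Newton-style) leaf weight, as used by XGBoost:
# w* = -sum(gradients) / (sum(hessians) + lambda).

def leaf_weight(grads, hess, lam=1.0):
    """Optimal weight for a leaf containing samples with these g_i, h_i."""
    return -sum(grads) / (sum(hess) + lam)

# For squared-error loss: g_i = pred_i - y_i and h_i = 1.
y = [2.0, 2.5, 3.0]
pred = [0.0, 0.0, 0.0]
g = [p - t for p, t in zip(pred, y)]
h = [1.0] * len(y)

w_unreg = leaf_weight(g, h, lam=0.0)  # with lam=0 this is the mean residual
w_reg = leaf_weight(g, h, lam=1.0)    # lambda shrinks the weight toward 0
```

With λ = 0 the leaf weight reduces to the mean residual (ordinary regression-tree behavior); λ > 0 shrinks it toward zero, which is the overfitting guard the citation refers to.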
“…The Light Gradient Boosting Machine (LightGBM), XGBoost, Random Forest, support vector regression (SVR), and Seasonal Autoregressive Integrated Moving Average (SARIMA) algorithms are used to build the models. These methods have been widely used in recent studies [55][56][57][58][59]. Hyperparameters are optimised for each algorithm to obtain the best-performing model.…”
Section: Performance Measures
confidence: 99%
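The per-algorithm parameter optimisation mentioned above usually amounts to a grid or random search over each model's hyperparameters against a validation metric. As a minimal, self-contained stand-in (a hypothetical setup, not the cited study's protocol), the sketch below grid-searches the smoothing parameter of simple exponential smoothing on a toy load series, selecting the value that minimises one-step-ahead MSE:

```python
# Hypothetical grid-search sketch: tune the alpha of simple exponential
# smoothing by one-step-ahead MSE on a toy load series.

def ses_forecast(series, alpha):
    """One-step-ahead forecasts: f[t] = alpha*y[t-1] + (1-alpha)*f[t-1]."""
    f = [series[0]]
    for y in series[:-1]:
        f.append(alpha * y + (1 - alpha) * f[-1])
    return f

def one_step_mse(series, alpha):
    f = ses_forecast(series, alpha)
    return sum((y - p) ** 2 for y, p in zip(series, f)) / len(series)

series = [10, 12, 11, 13, 12, 14, 13, 15, 14, 16]  # toy hourly load values
grid = [i / 10 for i in range(1, 10)]              # candidate alpha values
best_alpha = min(grid, key=lambda a: one_step_mse(series, a))
best_mse = one_step_mse(series, best_alpha)
```

The same pattern (enumerate candidates, score on a validation criterion, keep the best) applies to tuning tree depth and learning rate for XGBoost/LightGBM, C and the kernel width for SVR, or the seasonal orders for SARIMA.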