2023
DOI: 10.1007/s11440-023-01830-7
Prediction of mining induced subsidence by sparrow search algorithm with extreme gradient boosting and TOPSIS method

Cited by 7 publications (6 citation statements)
References 56 publications
“…Different numbers of particle swarms lead to different optimization results. It is generally recommended to select a swarm size between 20 and 120 [49]. In this study, swarm sizes of 20, 40, 60, 80, 100, and 120 were chosen to optimize the prediction of PPV using the RUN algorithm in combination with XGBoost.…”
Section: Results (mentioning, confidence: 99%)
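The quoted study pairs the RUN metaheuristic with XGBoost, and RUN is not available in common Python packages. As a minimal stand-in sketch, the code below sweeps the swarm sizes listed in the quote and, for each size, evaluates that many randomly sampled XGBoost configurations; the dataset, search ranges, and scoring are illustrative assumptions, not the cited study's setup.

```python
# Stand-in sketch: a random-search "swarm" replaces the RUN metaheuristic.
# For each swarm size named in the quote, that many candidate XGBoost
# configurations are sampled and scored by cross-validation.
import numpy as np
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=300, n_features=8, noise=0.1, random_state=0)
rng = np.random.default_rng(0)

def sample_params():
    # Illustrative search ranges, not the ranges used in the cited paper.
    return {
        "n_estimators": int(rng.integers(50, 200)),
        "max_depth": int(rng.integers(2, 8)),
        "learning_rate": float(rng.uniform(0.01, 0.3)),
    }

def score(params):
    model = xgb.XGBRegressor(objective="reg:squarederror", **params)
    return cross_val_score(model, X, y, cv=3, scoring="r2").mean()

for swarm_size in (20, 40, 60, 80, 100, 120):  # sizes listed in the quote
    swarm = [sample_params() for _ in range(swarm_size)]
    scores = [score(p) for p in swarm]
    best = swarm[int(np.argmax(scores))]
    print(swarm_size, round(float(np.max(scores)), 3), best)
```

A larger population simply evaluates more candidate configurations per run, which is the exploration-versus-run-time trade-off the recommended range of 20 to 120 is balancing.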
“…The important parameters in XGBoost regression analysis are shown in Table 1. The three main parameters that influence the XGBoost prediction results are num_trees, max_depth, and eta [49]. Usually, these parameters are manually adjusted during computation to obtain higher goodness of fit results.…”
Section: Methods (mentioning, confidence: 99%)
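For orientation only, here is a minimal sketch of setting those three parameters by hand with the xgboost Python package. In the native API, the quote's num_trees corresponds to the number of boosting rounds (num_boost_round), while max_depth and eta go in the parameter dictionary; the data and parameter values below are placeholder assumptions, not those of the cited work.

```python
# Hedged sketch: manually setting the three parameters named in the quote.
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=300, n_features=8, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

params = {
    "objective": "reg:squarederror",
    "max_depth": 4,   # depth of each regression tree
    "eta": 0.1,       # shrinkage / learning rate
}
dtrain = xgb.DMatrix(X_tr, label=y_tr)
dtest = xgb.DMatrix(X_te, label=y_te)

# num_boost_round plays the role of "num_trees" in the quoted description.
booster = xgb.train(params, dtrain, num_boost_round=200)
print("R2 on held-out data:", r2_score(y_te, booster.predict(dtest)))
```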
“…In addition, the underlying idea of the XGBoost algorithm is to combine the outputs of weak learners sequentially to obtain better performance. Owing to these advantages, its application in mining studies and the earth sciences has become widespread in recent years (Sharma, 2018; Ding et al, 2020; Zhong et al, 2020; Qiu et al, 2021; Xu et al, 2023; Wang et al, 2022; Hosseini et al, 2023). XGBoost minimizes the regularized cost given in Eq. (1) below (Kavzoğlu and Teke, 2022).…”
Section: Multicollinearity Analysis (mentioning, confidence: 99%)
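The excerpt cuts off before the equation it refers to, and the exact form in Kavzoğlu and Teke (2022) is not reproduced here. For reference only, the regularized objective that XGBoost is commonly written as minimizing is

```latex
\mathcal{L}(\phi) = \sum_{i} l\!\left(\hat{y}_i, y_i\right) + \sum_{k} \Omega(f_k),
\qquad
\Omega(f) = \gamma T + \tfrac{1}{2}\,\lambda \lVert w \rVert^{2},
```

where l is a differentiable loss between the prediction and the target, f_k is the k-th regression tree, T its number of leaves, w its leaf weights, and gamma and lambda are regularization coefficients.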
“…Then in 2023, Xu et al devised a novel intelligent approach to predict mining-induced land subsidence using a new sparrow search algorithm (Xu et al, 2023). They compared five machine learning models (extreme gradient boosting, random forest, adaptive boosting or AdaBoost, bootstrap aggregation, and gradient boosting) with their proposed model.…”
Section: Introduction (mentioning, confidence: 99%)