2021
DOI: 10.1007/s00366-021-01393-9
Performance evaluation of hybrid WOA-XGBoost, GWO-XGBoost and BO-XGBoost models to predict blast-induced ground vibration

Cited by 187 publications (83 citation statements)
References 70 publications
“…This usually occurs when the ANN model begins "to memorize the training set instead of learning them and consequently loses the ability to generalize" [48]. The methods proposed for resolving it include early stopping, noise injection, cross-validation, Bayesian regularization, and the optimization approximation algorithm [48,100,101]. Paneiro et al [102] employed bilevel optimization to avoid overfitting and to reduce the complexity of an ANN-based ground vibration model.…”
Section: Ground Vibration (mentioning)
confidence: 99%
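The early-stopping idea mentioned in that statement can be illustrated with a short sketch; this is not taken from any of the cited studies, and the synthetic data, network size, and parameter values are assumptions made purely for illustration:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for blasting features and a PPV-like target (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = X @ rng.normal(size=6) + rng.normal(scale=0.1, size=500)

X_tr, X_test, y_tr, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# early_stopping=True holds out 10% of the training data and stops training once the
# validation score has not improved for n_iter_no_change epochs, which limits the
# "memorizing the training set" behaviour described above.
ann = MLPRegressor(
    hidden_layer_sizes=(32, 16),
    early_stopping=True,
    validation_fraction=0.1,
    n_iter_no_change=20,
    max_iter=2000,
    random_state=0,
)
ann.fit(X_tr, y_tr)
print("held-out R^2:", ann.score(X_test, y_test))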
“…The final predicted value for a sample is obtained by summing the output values predicted by each tree for that sample. Unlike the commonly used gradient boosting decision tree [53], which only uses first-order derivative information during optimization, XGB performs a second-order Taylor expansion of the cost function and uses the first and second derivatives simultaneously, which gives XGB good results [54].…”
Section: Extreme Gradient Boosting (XGB) (mentioning)
confidence: 99%
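For reference, the second-order Taylor expansion described in that statement corresponds to the standard XGBoost objective at boosting iteration t (standard notation assumed here, not necessarily the cited papers' own symbols):

\mathcal{L}^{(t)} \approx \sum_{i=1}^{n}\left[\, l\big(y_i,\hat{y}_i^{(t-1)}\big) + g_i\, f_t(x_i) + \tfrac{1}{2}\, h_i\, f_t^{2}(x_i) \,\right] + \Omega(f_t),
\qquad
g_i = \partial_{\hat{y}_i^{(t-1)}}\, l\big(y_i,\hat{y}_i^{(t-1)}\big),
\qquad
h_i = \partial^{2}_{\hat{y}_i^{(t-1)}}\, l\big(y_i,\hat{y}_i^{(t-1)}\big),

where l is the loss, \Omega(f_t) is the regularization term, and f_t is the tree added at iteration t; the final prediction for a sample is the sum of the outputs of all trees, \hat{y}_i = \sum_{t} f_t(x_i).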
“…It has many unique advantages for solving small-sample, nonlinear, and high-dimensional pattern recognition problems, and to a great extent it overcomes the problems of "dimension disaster" (the curse of dimensionality) and "over-learning" (overfitting). It has been widely used in pattern recognition, function estimation, regression analysis, time series prediction, and other fields [53,54].…”
Section: Support Vector Machine (SVM) (mentioning)
confidence: 99%
“…The coefficient of determination (R²) is an index of the relationship between the actual and predicted pillar strength, and it has been used as a second index to evaluate the SVM model. An MSE of zero and an R² of 1.0 would indicate a perfect predictive model (Li et al 2020c; Qiu et al 2021; Zhou et al 2021c). The SVM model estimates pillar strength with an R² of 0.96 on the training and testing datasets.…”
Section: Support Vector Machine (SVM) (mentioning)
confidence: 99%
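A minimal sketch of how the MSE and R² indices mentioned above are typically computed; the numeric values below are illustrative assumptions, not data from the cited study:

import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

# y_true: measured pillar strengths; y_pred: model predictions (illustrative values only).
y_true = np.array([10.2, 12.5, 9.8, 14.1, 11.3])
y_pred = np.array([10.0, 12.9, 9.5, 13.8, 11.6])

mse = mean_squared_error(y_true, y_pred)   # 0 for a perfect model
r2 = r2_score(y_true, y_pred)              # 1.0 for a perfect model
print(f"MSE = {mse:.3f}, R^2 = {r2:.3f}")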