2022
DOI: 10.3390/math10050787
Novel Ensemble Tree Solution for Rockburst Prediction Using Deep Forest

Abstract: The occurrence of rockburst can cause significant disasters in underground rock engineering. It is crucial to predict and prevent rockburst in deep tunnels and mines. In this paper, the deficiencies of ensemble learning algorithms in rockburst prediction were investigated. To address these shortcomings, a novel machine learning model, deep forest, was proposed to predict rockburst risk. The deep forest combines the characteristics of deep learning and ensemble models, which can solve complex problems. To develop t…

Cited by 40 publications (14 citation statements)
References 56 publications
“…Xue et al. [10] optimized an ELM with the particle swarm optimization algorithm (PSO), selecting six quantitative rockburst parameters: the maximum tangential stress of the surrounding rock, the uniaxial compressive strength of the rock, the tensile strength of the rock, the stress ratio, the brittleness ratio of the rock, and the elastic energy index. The model was trained on 344 sets of rockburst cases and validated at a riverside hydropower station, and the results showed good predictive performance. Yin et al. [21] collected 400 groups of cases from microseismic monitoring data, optimized a CNN with adaptive moment estimation (Adam) and Bayesian optimization (the CNN-Adam-BO integrated algorithm), and compared the model against the continuous wavelet transform and the cross-wavelet transform, verifying its superiority. Barkat Ullah et al. [2] predicted a short-term rockburst database with t-SNE, K-means clustering, and XGBoost, providing a reference for subsequent research. Liu et al. [22] optimized BP, PNN, and SVM with PSO and compared the prediction performance of the three models. Qiu et al. [23] applied Sand Cat Swarm Optimization (SCSO) to Extreme Gradient Boosting (XGBoost) on a short-term rockburst database, established a new model for predicting short-term rockburst damage, trained it with 254 sets of data from Australia and Canada, developed a graphical presentation interface, and provided guidance and direction for subsequent research. Sun et al. [24] used the Yeo-Johnson transform, K-means SMOTE oversampling, and optimal rockburst feature dimension determination to optimize the data structure and effectively improve the accuracy of rockburst prediction. Zhou et al. [25] proposed hybrid models of PSO-, HHO-, and MFO-optimized SVM, selected six feature parameters such as the angular frequency ratio and total energy as input variables, and compared the models using accuracy, precision, and kappa coefficients. Li et al. [26] proposed a new prediction model (deep forest) using Bayesian hyper-parameter tuning and showed its superiority by comparison with other models. Qiu et al. [27] developed a rockburst prediction fusion model by combining multiple machine learning models with D-S evidence theory, which mitigated the uncertainty and poor robustness of a single base classifier's predictions. Sun et al. [28] designed a beetle antenna search algorithm to optimize a random forest classifier, which was found to be more accurate than single models and empirical formulas. Zhou et al. [29].…”
Section: Introduction
confidence: 99%
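Several of the studies above use PSO to tune model hyper-parameters. As a minimal, illustrative sketch (not any cited study's actual setup), the following applies a standard particle swarm update rule to minimize a simple quadratic objective; the function, bounds, and coefficients are all placeholder choices:

```python
import numpy as np

# Minimal PSO sketch: each particle tracks its personal best (pbest),
# the swarm tracks a global best (gbest), and velocities are updated
# with inertia (w) plus cognitive (c1) and social (c2) attraction terms.
rng = np.random.default_rng(42)

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = rng.uniform(-5, 5, (n_particles, dim))   # random initial positions
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.apply_along_axis(f, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        vals = np.apply_along_axis(f, 1, pos)
        improved = vals < pbest_val                 # update personal bests
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()    # update global best
    return gbest, f(gbest)

best, best_val = pso(lambda x: np.sum(x**2))  # sphere function, optimum at 0
print(best_val)  # converges close to 0
```

In the surveyed work the objective would instead be a cross-validated loss of an ELM, SVM, or similar model evaluated at candidate hyper-parameter positions.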
“…They were inspired by the principles of biological evolution and certain physical phenomena. Mainstream intelligent algorithms include PSO [16,42,43], BP neural networks [44,45], Bayesian networks [46,47], random forests [48,49], genetic algorithms [50], support vector machines [51], decision tree models [52,53], and so on. These algorithms have shown excellent performance in handling complex problems and exhibit high computational efficiency.…”
Section: Introduction
confidence: 99%
“…A long short-term memory (LSTM) network can accurately capture the internal relationships between earlier and later elements in time-series data, forgetting earlier elements to form short-term memory that guides later elements while retaining key information as long-term memory [20,21]. In the deep forest, the multi-grained scanning process identifies and ranks key variables in the input data, capturing features with a sliding window; the cascade forest process then fully exploits these features and records the processed data [22,23]. The Transformer is a neural network with a self-attention mechanism: time-series data can serve as the input to the encoder of a Transformer model, and the decoder predicts future values autoregressively [24,25].…”
Section: Introduction
confidence: 99%
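The multi-grained scanning step described above can be sketched in a few lines: a window slides over each raw feature vector, one forest scores every sub-vector, and the per-window class probabilities are concatenated into an enhanced representation for the cascade stage. This is a toy illustration with synthetic data, not the cited papers' configuration; the window size, forest settings, and labels are all placeholder choices:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))               # 100 samples, 10 raw features
y = (X[:, :5].sum(axis=1) > 0).astype(int)   # toy binary labels

def multi_grain_scan(X, y, window=4):
    """Slide a window over the features; one forest per window position."""
    d = X.shape[1]
    blocks = []
    for start in range(d - window + 1):
        sub = X[:, start:start + window]           # sub-vectors for this window
        forest = RandomForestClassifier(n_estimators=30, random_state=0)
        forest.fit(sub, y)
        blocks.append(forest.predict_proba(sub))   # per-window class probabilities
    return np.hstack(blocks)                       # enhanced feature vector

Z = multi_grain_scan(X, y)
print(Z.shape)  # (100, 14): 7 window positions x 2 class probabilities
```

In the full deep forest, this enhanced representation is then passed through cascaded layers of forests, each layer appending its class-probability outputs to the input of the next.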