2022
DOI: 10.1007/s11440-022-01450-7
Predicting tunnel squeezing using support vector machine optimized by whale optimization algorithm

Cited by 80 publications (17 citation statements)
References 85 publications
“…On the testing dataset, the KNN's forecasting outcomes were obtained. Various performance indices including precision, recall, and F1-score have been employed by the researchers to evaluate the performance of a classification model [77]. In this study, precision, recall, and F1-score have been used to predict the outcomes of the proposed KNN algorithm when associated with ISOMAP and FCM.…”
Section: Results
confidence: 99%
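The three metrics named in this excerpt follow directly from the confusion-matrix counts. A minimal sketch (the labels below are illustrative, not data from the cited study):

```python
def prf1(y_true, y_pred, positive=1):
    """Precision, recall, and F1-score for one positive class (binary case)."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)  # true positives
    fp = sum(t != positive and p == positive for t, p in pairs)  # false positives
    fn = sum(t == positive and p != positive for t, p in pairs)  # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Illustrative ground truth and predictions:
p, r, f = prf1([0, 1, 1, 0, 1, 1, 0, 0], [0, 1, 0, 0, 1, 1, 1, 0])
```

In practice these would be computed on the held-out testing dataset, e.g. via `sklearn.metrics.precision_score`, `recall_score`, and `f1_score`.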
“…On the testing dataset, the KNN's forecasting outcomes were obtained. Various performance indices including precision, recall, and F1-score have been employed by the researchers to evaluate the performance of a classi cation model [77]. In this study, precision, recall, and F1-score have been used to predict the outcomes of the proposed KNN algorithm when associated with ISOMAP and FCM.…”
Section: Resultsmentioning
confidence: 99%
“…Many ensemble tree models, i.e., GBM, XGB, and RF, perform better than other ML models. Taylor diagrams [58,59] were introduced to determine the strength of the DF model compared to other models. In this study, Taylor diagrams combine the Matthews correlation coefficient (MCC), centered root mean square error (green dotted lines in Figure 11), and standard deviation into a polar diagram.…”
Section: Model Performance Comparison
confidence: 99%
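One of the three statistics combined in the excerpt's Taylor diagram, the Matthews correlation coefficient, can be sketched from confusion-matrix counts (binary case; the labels below are illustrative):

```python
import math

def matthews_corr(y_true, y_pred):
    """Matthews correlation coefficient (MCC) for binary 0/1 labels."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == 1 and p == 1 for t, p in pairs)
    tn = sum(t == 0 and p == 0 for t, p in pairs)
    fp = sum(t == 0 and p == 1 for t, p in pairs)
    fn = sum(t == 1 and p == 0 for t, p in pairs)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

mcc = matthews_corr([0, 1, 1, 0, 1, 1, 0, 0], [0, 1, 0, 0, 1, 1, 1, 0])
```

MCC ranges from -1 (total disagreement) through 0 (chance level) to +1 (perfect prediction), which is what makes it suitable as the radial or angular coordinate of a polar comparison plot alongside centered RMSE and standard deviation.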
“…One of the typical examples is the work on estimating residential building energy consumption by Tabrizchi et al [19], where a multi-verse optimizer is employed to tune ε-SVR with cross-validation. Considering actual applications, researchers have searched for the tuning in ε-SVR [18] with meta-heuristic algorithms, such as moth flame optimization (MFO) [20], whale optimization algorithm (WOA) [21], grey wolf optimizer (GWO) [22], grasshopper optimization algorithm (GOA) [23], flower pollination algorithm (FPA) [24], differential evolution [25], and particle swarm optimization [26]. This kind of combined method based on cross-validation often requires high computational costs to obtain a good optimum for the insensitivity parameter.…”
Section: Literature Review
confidence: 99%
“…In addition, the 10-CV [9] was applied in the insensitivity parameter selection with the same alternative parameter settings as the former simulations. According to our literature review, we employed three recent meta-heuristic methods with 10-CV to tune the insensitivity parameter for the ε-SVR: whale optimization algorithm (WOA) [21], grey wolf optimizer (GWO) [22], and multi-verse optimizer (MVO) [19], each with 10 search agents. All algorithms were performed on an Intel i7-8700 CPU with 16.0 GB of RAM.…”
Section: Case Studies
confidence: 99%
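The cross-validated tuning loop the excerpts describe can be sketched in a simplified form. This is not the WOA/GWO/MVO search itself, only the k-fold scoring scaffold around an exhaustive candidate list; `cv_error` is a hypothetical callback standing in for training and scoring an ε-SVR on one fold split:

```python
def kfold_indices(n, k=10):
    """Yield (train, test) index lists for k-fold cross-validation."""
    idx = list(range(n))
    folds = [idx[i::k] for i in range(k)]  # round-robin fold assignment
    for i in range(k):
        test = folds[i]
        held_out = set(test)
        train = [j for j in idx if j not in held_out]
        yield train, test

def tune_epsilon(cv_error, candidates, n, k=10):
    """Pick the epsilon with the lowest mean k-fold CV error.

    cv_error(eps, train_idx, test_idx) is a user-supplied callback that
    would fit an epsilon-SVR on the training fold and return test error.
    """
    best_eps, best_score = None, float("inf")
    for eps in candidates:
        score = sum(cv_error(eps, tr, te) for tr, te in kfold_indices(n, k)) / k
        if score < best_score:
            best_eps, best_score = eps, score
    return best_eps

# Toy objective minimized at eps = 0.1 (illustrative only):
best = tune_epsilon(lambda eps, tr, te: (eps - 0.1) ** 2,
                    [0.01, 0.1, 0.5], n=20, k=5)
```

A meta-heuristic such as WOA replaces the fixed candidate list with a population of search agents that propose new ε values each iteration; the costly part, as the first excerpt notes, is that every proposal still requires a full k-fold fit-and-score pass.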