2020
DOI: 10.1002/cem.3311

Improving Harris hawks optimization algorithm for hyperparameters estimation and feature selection in v‐support vector regression based on opposition‐based learning

Abstract: Many real problems have been solved by support vector regression, especially v‐support vector regression (v‐SVR), but there are hyperparameters that usually need to be tuned. In addition, v‐SVR cannot perform feature selection on its own. Nature‐inspired algorithms have been used both for feature selection and for hyperparameter estimation. In this paper, the opposition‐based learning Harris hawks optimization algorithm (HHOA‐OBL) is proposed to optimize the hyperparameters of v‐SVR while embedding the feature sel…
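The abstract describes optimizing v‐SVR hyperparameters and the feature subset simultaneously, which implies each optimizer candidate encodes both. A minimal sketch of such an encoding, assuming the usual v‐SVR knobs (C, nu, gamma) and a 0/1 mask per feature — the names, bounds, and feature count here are illustrative assumptions, not the paper's exact setup:

```python
import random

# Hypothetical search-space bounds for the real-valued v-SVR hyperparameters.
N_FEATURES = 6
BOUNDS = {"C": (0.1, 100.0), "nu": (0.01, 1.0), "gamma": (1e-3, 10.0)}

def random_candidate():
    """One optimizer candidate: [C, nu, gamma] followed by a 0/1 feature mask."""
    hypers = [random.uniform(lo, hi) for lo, hi in BOUNDS.values()]
    mask = [random.randint(0, 1) for _ in range(N_FEATURES)]
    return hypers + mask

def decode(candidate):
    """Split a candidate back into hyperparameters and selected feature indices."""
    C, nu, gamma = candidate[:3]
    selected = [i for i, bit in enumerate(candidate[3:]) if bit == 1]
    return {"C": C, "nu": nu, "gamma": gamma, "features": selected}
```

In this scheme the optimizer's fitness function would decode each candidate, fit a v‐SVR on the selected feature columns with the decoded hyperparameters, and return a validation error such as MSE.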

Cited by 19 publications (9 citation statements)
References 65 publications
“…The research of Ismael et al. (2020, 2021) proposes HHO to optimize v-SVR hyperparameters by embedding feature selection simultaneously, without separating the two processes. That study also uses only a few chemical datasets to assess the performance of the proposed method.…”
Section: Discussion
confidence: 99%
“…There are several related studies. Ismael et al. (2020, 2021) propose HHO to optimize v-SVR hyperparameters by embedding feature selection simultaneously, without separating the two processes. However, the model is evaluated using only MSE.…”
Section: Introduction
confidence: 99%
“…To perform segmentation on satellite and oil pollution images. [23] Hybrid HHO–differential evolution: splits the population into two equal subpopulations and trains them in parallel with HHO and differential evolution; applied to multilevel image segmentation. [24] Adaptive HHO: uses mutation to clip the escape energy; applied to multilevel image segmentation. [25] Hybrid OBL-HHO: OBL generates solutions for HHO through an adversarial learning approach.…”
Section: Fuzzy Harris Hawk Algorithm
confidence: 99%
“…HHO has been used extensively in feature selection to optimize the parameters of classification methods [69, 70]. Ismael et al. [25] proposed improving the HHO method by employing an opposition-based learning (OBL) approach. OBL generates solutions for meta-heuristic algorithms through an adversarial learning approach.…”
Section: Applications of HHO
confidence: 99%
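The OBL idea quoted above — generating an "opposite" candidate for each solution and keeping whichever is fitter — can be sketched as follows. This is a minimal stdlib-only illustration of opposition-based initialization on a toy sphere objective, not the paper's HHOA‐OBL implementation:

```python
import random

def opposite(x, lo, hi):
    """OBL: the opposite of x in the box [lo, hi] is lo + hi - x, per dimension."""
    return [l + h - xi for xi, l, h in zip(x, lo, hi)]

def sphere(x):
    """Toy minimization objective: sum of squares, optimum at the origin."""
    return sum(xi * xi for xi in x)

def obl_init(pop_size, lo, hi, fitness):
    """Opposition-based initialization: build a random population, add the
    opposite of every candidate, and keep the best pop_size of the union."""
    pop = [[random.uniform(l, h) for l, h in zip(lo, hi)] for _ in range(pop_size)]
    opp = [opposite(x, lo, hi) for x in pop]
    return sorted(pop + opp, key=fitness)[:pop_size]
```

In an OBL-HHO hybrid, the same opposition step is typically applied not only at initialization but also periodically during the search, so the hawks can jump to the opposite region when the current one stagnates.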
“…The mean values of F10 and F11 are slightly superior, and the stability is also better. This section compares CCCHHO with THHO [33], MHHO [34], and OBLHHO [35]. We record the optimal value (best), mean value (avg), worst value (worst), and standard deviation (std) of each algorithm; the results are shown in Table 3.…”
Section: Comparison of CCCHHO and Other Optimization
confidence: 99%