2022
DOI: 10.1007/s42235-022-00297-8

Boosting Whale Optimizer with Quasi-Oppositional Learning and Gaussian Barebone for Feature Selection and COVID-19 Image Segmentation

Abstract: The whale optimization algorithm (WOA) tends to fall into local optima and fails to converge quickly on complex problems. To address these shortcomings, an improved WOA (QGBWOA) is proposed in this work. First, quasi-opposition-based learning is introduced to enhance WOA's ability to search for optimal solutions. Second, a Gaussian barebone mechanism is embedded to promote diversity and expand the scope of the solution space. To verify the advantages of QGBWOA, comparison experiments between…
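The two enhancements named in the abstract are standard metaheuristic operators, and a minimal sketch may help make them concrete. The snippet below is an illustrative interpretation, not the paper's implementation: quasi-opposition-based learning samples a point between the interval midpoint and the opposite point `lb + ub - x`, and the Gaussian barebone move (as in barebones PSO) draws a new position from a normal distribution centered between the current and best solutions. Function names and the fixed seed are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def quasi_opposite(x, lb, ub):
    """Quasi-opposition-based learning: sample uniformly between the
    interval midpoint (lb + ub)/2 and the opposite point lb + ub - x."""
    center = (lb + ub) / 2.0
    opposite = lb + ub - x
    lo = np.minimum(center, opposite)
    hi = np.maximum(center, opposite)
    return rng.uniform(lo, hi)

def gaussian_barebone(x, best):
    """Gaussian barebone move: draw the candidate position from
    N(mean=(x + best)/2, std=|x - best|), per dimension."""
    mean = (x + best) / 2.0
    std = np.abs(x - best)
    return rng.normal(mean, std)
```

In a QGBWOA-style loop, such operators would be applied to candidate whales alongside the usual WOA encircling and spiral updates, keeping whichever position has better fitness.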

Cited by 58 publications (15 citation statements)
References 93 publications (101 reference statements)
“…In contrast to traditional mathematical approaches, these algorithms offer the distinct advantage of not imposing constraints on the objective function, making them particularly effective for tackling multimodal problems. Compared with conventional optimization techniques, MAs have the characteristics of adaptability, simplicity, a derivation-free mechanism, and the ability to evade local optima [26]. Consequently, MAs such as monarch butterfly optimization (MBO) [27], the slime mould algorithm (SMA) [28], the moth search algorithm (MSA) [29], hunger games search (HGS) [30], the Runge-Kutta method (RUN) [31], the colony predation algorithm (CPA) [32], weighted mean of vectors (INFO) [33], Harris hawks optimization (HHO) [34], and the rime optimization algorithm (RIME) [35] have gained significant traction in recent years as valuable tools for addressing complex optimization problems, including breast cancer prediction [36], melanoma prediction for imbalanced data [37], and medical data feature selection [38].…”
Section: Literature Survey
confidence: 99%
“…[37] In their study [38], the proposed QGBWOA proved its superiority over the compared algorithms in feature selection and multi-threshold image segmentation across several evaluation metrics [39]. In the literature [40], a multiobjective quadratic binary HHO (MOQBHHO) technique with the KNN method as a wrapper classifier is implemented to extract optimal feature subsets. The presented methodology is found superior in obtaining the best trade-off between two fitness assessment criteria, compared to other existing multiobjective techniques, for recognizing relevant features.…”
Section: Related Work
confidence: 99%
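The citation statement above describes wrapper-style feature selection, where a KNN classifier scores candidate feature subsets. A minimal, self-contained sketch of such a wrapper fitness function is shown below; the function name, the leave-one-out 1-NN evaluation, and the `alpha` weighting between classification error and subset size are assumptions chosen for illustration (a common formulation in this literature), not the exact objective of the cited works.

```python
import numpy as np

def knn_wrapper_fitness(mask, X, y, alpha=0.99):
    """Wrapper fitness for a binary feature mask: leave-one-out 1-NN
    classification error on the selected columns, blended with the
    selected-feature ratio. Lower is better."""
    mask = np.asarray(mask, dtype=bool)
    if not mask.any():
        return 1.0                        # selecting no features is worst
    Xs = X[:, mask]
    # pairwise Euclidean distances between all samples
    d = np.linalg.norm(Xs[:, None, :] - Xs[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)           # leave-one-out: ignore self-match
    pred = y[np.argmin(d, axis=1)]        # label of the nearest neighbour
    error = np.mean(pred != y)
    return alpha * error + (1.0 - alpha) * mask.sum() / X.shape[1]
```

A binary metaheuristic (e.g. a binary WOA or HHO variant) would then search over masks, minimizing this fitness so that informative features are kept and redundant ones are dropped.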
“…[19–22] Thankfully, a series of heuristic algorithms with advanced theories have been successively proposed, including monarch butterfly optimization [23], the moth search algorithm [24], hunger games search (HGS) [25], the Runge-Kutta method [26], the colony predation algorithm [27], weighted mean of vectors [28], Harris hawks optimization (HHO) [29], the rime optimization algorithm (RIME) [30], and the sine-cosine algorithm (SCA) [31]. In addition, a number of combinatorial algorithms have been recognized, such as BOAALO [32], based on the butterfly optimization algorithm [33] and the ant lion optimizer [34]; CNNA-BES [35], based on a convolutional neural network architecture and the bald eagle search optimization algorithm [36]; QGBWOA [37], based on quasi-opposition-based learning and a Gaussian barebone mechanism; and MOQBHHO [38], based on the K-nearest-neighbor method and multiobjective HHO. However, according to the no-free-lunch theorem [39], no single algorithm can perform best on all optimization problems.…”
Section: Introduction
confidence: 99%