2020
DOI: 10.1007/s00521-020-05210-0

The monarch butterfly optimization algorithm for solving feature selection problems

Cited by 91 publications (46 citation statements)
References 66 publications
“…The proposed FS method is not tested on datasets that consist of more than 400 features. Alweshah et al. [83] introduced a binary monarch butterfly optimization algorithm (bMBO) for selecting optimal features from high-dimensional datasets. The proposed bMBO could show inconsistent results on gigabyte-size datasets.…”
Section: Literature Review (mentioning)
confidence: 99%
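As a rough, hypothetical sketch of how a binary metaheuristic such as bMBO typically represents candidate feature subsets (this is not the authors' implementation), a continuous position vector can be mapped to a 0/1 feature mask with a sigmoid transfer function:

```python
# Hypothetical sketch of binary subset encoding, in the spirit of binary
# metaheuristic feature selection (not the authors' bMBO implementation).
import numpy as np

def to_binary_mask(position, rng):
    """Map a continuous position vector to a 0/1 feature mask using a
    sigmoid transfer function, a common choice in binary swarm/evolutionary
    feature selection."""
    prob = 1.0 / (1.0 + np.exp(-position))           # selection probability per feature
    mask = (rng.random(position.shape) < prob).astype(int)
    if mask.sum() == 0:                               # guard: keep at least one feature
        mask[rng.integers(mask.size)] = 1
    return mask

rng = np.random.default_rng(0)
position = rng.normal(size=30)                        # one candidate "butterfly"
mask = to_binary_mask(position, rng)
print(f"selected {mask.sum()} of {mask.size} features")
```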
“…Nowadays, FS is an essential step to preprocess high-dimensional datasets. It must be pointed out that representative computational intelligence algorithms have been applied to improve FS in different studies, such as [7], [9], [33], [34], [27], [46], and [47]. The optimization methods aim to obtain the optimal solution for FS (i.e., a significant feature subset) within an appropriate time and cost.…”
Section: Related Work (mentioning)
confidence: 99%
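A common way such optimization methods trade off subset quality against subset size is a weighted wrapper objective. The sketch below is a generic illustration only; the k-NN wrapper, the dataset, and the alpha weight are assumptions, not taken from the cited works:

```python
# Generic wrapper-style fitness used by many metaheuristic FS methods:
# weighted sum of classification error and relative subset size (illustrative only).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y, alpha=0.99):
    """Lower is better: alpha weights the 5-fold CV error of a k-NN wrapper,
    while (1 - alpha) penalizes the fraction of selected features."""
    if mask.sum() == 0:
        return 1.0                                    # worst score for empty subsets
    clf = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(clf, X[:, mask == 1], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

X, y = load_breast_cancer(return_X_y=True)
all_features = np.ones(X.shape[1], dtype=int)         # baseline: keep everything
print(f"fitness with all features: {fitness(all_features, X, y):.4f}")
```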
“…When all possible subsets of the dataset are generated and evaluated, the complexity is very high, with a processing time on the order of 2^x, where x is the number of characteristics in the dataset [23]. Therefore, researchers have sought to formulate approaches to solve the feature selection (FS) problem and provide solutions more efficiently than conventional techniques.…”
Section: Introduction (mentioning)
confidence: 99%
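To make that exhaustive-search cost concrete, a quick back-of-the-envelope count (not taken from the cited paper) shows how the number of non-empty candidate subsets grows with the feature count x:

```python
# Exhaustive feature selection must consider 2**x - 1 non-empty subsets.
for x in (10, 20, 30, 40):
    print(f"{x:>2} features -> {2**x - 1:>17,} candidate subsets")
```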
“…It has been shown that they can help minimize execution time and produce specific outcomes. Grey Wolf Optimizer (GWO) [31], Whale Optimization Algorithm [32], Monarch Butterfly Algorithm [23], Coyote Optimization Algorithm [33], Genetic Algorithm [34], Krill Herd Algorithm [35], Harmony Search [36], Aquila Optimizer [37], Particle Swarm Algorithm [38], and Parallel Membrane-inspired Framework [39] are examples of the metaheuristics that have been used to address feature selection problems.…”
Section: Introduction (mentioning)
confidence: 99%