2021
DOI: 10.1109/access.2021.3117853

An Improved Binary Grey-Wolf Optimizer With Simulated Annealing for Feature Selection

Abstract: This paper proposes improvements to the binary grey-wolf optimizer (BGWO) to solve the feature selection (FS) problem associated with high data dimensionality and irrelevant, noisy, and redundant data, so that machine learning algorithms can attain better classification/clustering accuracy in less training time. We propose three variants of BGWO in addition to the standard variant, applying different transfer functions to tackle the FS problem. Because BGWO generates continuous values and FS needs disc…
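The abstract's point that BGWO produces continuous positions while FS requires binary decisions is usually handled with a transfer function. The sketch below is a minimal illustration only, not the paper's exact formulation (the specific transfer functions are not visible in the truncated abstract): it maps a continuous wolf position to a 0/1 feature mask using a common S-shaped (sigmoid) and V-shaped function.

```python
import numpy as np

def s_shaped(x):
    """Common S-shaped (sigmoid) transfer function."""
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    """Common V-shaped transfer function."""
    return np.abs(np.tanh(x))

def binarize(position, transfer=s_shaped, rng=None):
    """Map a continuous wolf position to a binary feature mask.

    A feature is selected (bit = 1) when a uniform random draw falls below
    the transfer-function value for that dimension.  This is the usual
    S-shaped rule; V-shaped functions normally flip the current bit instead,
    which is omitted here for brevity.
    """
    rng = np.random.default_rng() if rng is None else rng
    probs = transfer(np.asarray(position, dtype=float))
    return (rng.random(probs.shape) < probs).astype(int)

# Example: a 6-dimensional continuous position -> binary feature mask
mask = binarize([1.2, -0.4, 0.0, 2.5, -1.8, 0.3])
print(mask)  # e.g. [1 0 1 1 0 1]
```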

Cited by 18 publications (5 citation statements) · References 110 publications
“…In the work of Hu et al [34], an enhanced variant of the binary GWO was introduced, incorporating a novel strategy for updating the parameter governing exploration and exploitation, along with five transfer functions for mapping continuous values to their binary counterparts, thereby enhancing the quality of candidate solutions. Likewise, Abdel-Basset et al [35] proposed three distinct binary GWO variants, each utilizing different transfer functions. In addition to GWO-based research, PSO has been a focus in prior studies.…”
Section: Metaheuristic-based FS (mentioning)
confidence: 99%
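For context on the parameter mentioned in the excerpt above, the sketch below shows the canonical continuous GWO position update with its standard linearly decreasing control parameter a. It is a minimal illustration under that standard formulation; the "novel strategy" of Hu et al and the binarization steps of the cited works are not reproduced here.

```python
import numpy as np

def gwo_step(wolves, alpha, beta, delta, t, max_iter, rng=None):
    """One canonical (continuous) GWO position update.

    The parameter `a` decreases linearly from 2 to 0 over the run and governs
    the exploration/exploitation balance; the cited work replaces this linear
    schedule with its own update strategy, which is not reproduced here.
    """
    rng = np.random.default_rng() if rng is None else rng
    a = 2.0 * (1.0 - t / max_iter)                   # standard linear decay
    new_positions = []
    for X in wolves:
        X = np.asarray(X, dtype=float)
        guided = []
        for leader in (alpha, beta, delta):
            A = 2.0 * a * rng.random(X.shape) - a    # step size and direction
            C = 2.0 * rng.random(X.shape)
            D = np.abs(C * leader - X)               # distance to the leader
            guided.append(leader - A * D)
        new_positions.append(np.mean(guided, axis=0))  # average of the three pulls
    return np.array(new_positions)
```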
“…Ouadfel et al [31] proposed a hybrid feature selection approach based on the ReliefF filter method and the equilibrium optimizer (EO), which consists of two phases and was tested on several open datasets. Abdel-Basset et al [14] proposed three variants of BGWO in addition to the standard variant, applying different transfer functions to tackle the feature selection problem. In [32], two wrapper feature selection approaches based on the farmland fertility algorithm (FFA), denoted BFFAS and BFFAG, were proposed; these methods are effective in solving feature selection problems.…”
Section: Swarm Intelligence-based Feature Selection (mentioning)
confidence: 99%
“…Thus, it can be seen as an effective method to overcome the challenges of feature selection. For instance, several well-known swarm intelligence algorithms, e.g., the genetic algorithm (GA) [10], particle swarm optimization (PSO) [11], dragonfly algorithm (DA) [12], ant-lion optimizer (ALO) [13], and grey wolf optimizer (GWO) [14], have been applied to feature selection.…”
Section: Introduction (mentioning)
confidence: 99%
“…The FB methods classify features based on their inherent characteristics, e.g., dependency, information consistency, rank, etc., without using ML methods [27]. Although FB methods are faster and have a lower computational cost than WB methods because they do not call ML tools, they typically cannot offer satisfactory solutions [43,44]. This is because the selected features do not significantly affect the performance of ML methods [1,8,45].…”
Section: Introduction (mentioning)
confidence: 99%
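The excerpt above contrasts filter-based (FB) ranking with wrapper-based (WB) selection, where each candidate subset is scored by an actual learner. A minimal wrapper-style fitness function is sketched below; the k-NN classifier, 5-fold cross-validation, and the 0.99 error/size weighting are illustrative assumptions, not details taken from the cited works.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def wrapper_fitness(mask, X, y, alpha=0.99):
    """Wrapper-style fitness of a binary feature mask (lower is better).

    Combines the cross-validated classification error of an actual learner
    with a small penalty on the number of selected features.  alpha=0.99 is
    an assumed, commonly used weighting, not a value from the paper.
    """
    selected = np.flatnonzero(mask)
    if selected.size == 0:                      # empty subsets are invalid
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    size_ratio = selected.size / X.shape[1]
    return alpha * (1.0 - acc) + (1.0 - alpha) * size_ratio

# Usage sketch on any (X, y) dataset, e.g.:
# from sklearn.datasets import load_breast_cancer
# X, y = load_breast_cancer(return_X_y=True)
# mask = np.random.randint(0, 2, X.shape[1])
# print(wrapper_fitness(mask, X, y))
```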