2022
DOI: 10.32604/cmc.2022.019611

BHGSO: Binary Hunger Games Search Optimization Algorithm for Feature Selection Problem

Abstract: In machine learning and data mining, feature selection (FS) is a classical yet complicated optimization problem. Because its run time grows exponentially with the number of features, FS is treated as an NP-hard problem. Efforts to build new FS solutions are driven by the ongoing need for an efficient FS framework and by the success of swarm-based algorithms in diverse optimization scenarios. This paper presents two binary variants of a Hunger Games Search Optimization (HGSO) algorithm based on V- and S-shaped transf…
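For orientation, the sketch below shows how S- and V-shaped transfer functions are commonly used to binarize a continuous search-agent position into a feature mask in wrapper FS. The sigmoid and |tanh| forms, the flip-versus-set choice, and all names here are illustrative assumptions; the paper defines its own BHGSO-V/BHGSO-S rules.

```python
import numpy as np

def s_shaped(x):
    # S-shaped (sigmoid) transfer: continuous value -> probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    # V-shaped transfer: probability from the magnitude of tanh.
    return np.abs(np.tanh(x))

def binarize(position, transfer, rng):
    # Bit d of the feature mask is 1 when a uniform draw falls below the
    # transfer-function probability for dimension d. (Some V-shaped schemes
    # flip the current bit instead of setting it; the paper's exact update
    # rule may differ from this sketch.)
    probs = transfer(position)
    return (rng.random(position.shape) < probs).astype(int)

rng = np.random.default_rng(0)
position = rng.normal(size=10)   # hypothetical continuous HGSO position
print(binarize(position, s_shaped, rng))
print(binarize(position, v_shaped, rng))
```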

Cited by 35 publications (9 citation statements). References 36 publications.
“…As detailed in the numerical experiments (Section 5), we establish that the GSO based on the adaptively decreasing step-size strategy (9) performs well for solving AVEs. We call this improved GSO with the adaptive variable step-size model (10) SIGGSO; it comprises models (3), (4), (10), and (7).…”
Section: Initialization Related Parameters
mentioning; confidence: 80%
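The adaptively decreasing step size mentioned above is defined by equation (9) of the citing paper, which is not reproduced in this excerpt; purely as a hypothetical stand-in, a step size that shrinks with the iteration count can be sketched as:

```python
def adaptive_step(alpha0, k, decay=0.95):
    # Illustrative geometric decay over iteration k; the citing paper's
    # model (9) is not shown here, so this schedule is an assumption only.
    return alpha0 * decay ** k
```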
“…The proposed CMBO can outperform the standard MBO and eight other state-of-the-art canonical algorithms. [9] presents two binary variants of a Hunger Games Search Optimization (HGSO) algorithm based on V- and S-shaped transfer functions (BHGSO-V and BHGSO-S) within a wrapper FS model for choosing the best features from a large dataset. The experimental results demonstrate that the BHGSO-V algorithm can reduce dimensionality and choose the most helpful features for classification problems.…”
Section: Introduction
mentioning; confidence: 99%
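The excerpt describes a wrapper FS model, in which each candidate feature mask is scored by actually training a classifier on the selected columns. A minimal sketch of a typical wrapper fitness (weighted classification error plus selected-feature ratio, with KNN as the learner) follows; the weight, classifier, and cross-validation setup are common defaults assumed here, not the paper's exact configuration.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def wrapper_fitness(mask, X, y, w=0.99):
    # Lower is better: trade classification error against the fraction
    # of features kept; w close to 1 favors accuracy over sparsity.
    if mask.sum() == 0:
        return 1.0                                  # penalize empty subsets
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    return w * (1.0 - acc) + (1.0 - w) * mask.sum() / mask.size
```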
“…To get around problems with hypotheses and difficulty with the convergence of traditional iterative procedures, metaheuristic algorithms have been reported to identify the parameters of the PV cell/module 15,16,34‐36 . In general, several metaheuristic techniques, including Genetic Algorithm (GA), 37 Differential Evolutionary (DE) algorithm, 38 Artificial Bee Colony Algorithm, 39 ant colony optimization, 40 Particle Swarm Optimization (PSO), 12 Bat Algorithm, 41 Cuckoo Search Optimization Algorithm, 42 Bacterial Foraging Algorithm, 43 Pattern Search Algorithm (PSA), 44 Tabu Search Algorithm, 45 Harmony Search Algorithm, 46 Symbiotic Organisms Search (SOS) algorithm, 47 Sunflower Optimization Algorithm, 48 Gray Wolf Optimizer (GWO), 49‐51 hybrid GWO, 52 Salp Swarm Algorithm (SSA), 53‐55 Whale Optimization Algorithm, 56,57 Dragonfly Algorithm, 58 Firefly Optimization algorithm, 59 Fireworks Algorithm (FA), 60 Moth‐Flame Optimization (MFO) algorithm, 61 Multiverse Optimizer, 62 Sine‐Cosine Algorithm (SCA), 63 Ant Lion Optimizer (ALO), 64 Cat Swarm Algorithm (CSA), 65 JAYA algorithm, 66,67 Harris Hawk Optimizer (HHO), 31 Coyote Optimization Algorithm (COA), 68 Slime Mold Algorithm (SMA), 69‐71 RAO algorithm, 72,73 Atom Search Optimizer, 74 Manta Ray Foraging Algorithm, 32 Equilibrium Optimizer, 75,76 Spotted Hyena Algorithm, 77 Marine Predator Algorithm (MPA), 78 Arithmetic Optimization Algorithm (AOA), 79 Gradient‐Based Optimizer (GBO), 80‐82 Hunger Games Search Optimizer (HGSO), 83,84 Runge–Kutta Optimization Algorithm (RKOA), 85 thermal exchange optimizer, 86 Honey Badger Optimizer (HBO), 87 Jumping Spider Optimizer (...…”
Section: Introduction
mentioning; confidence: 99%
“…Even though metaheuristic approaches are more reliable and efficient at finding solutions, their effectiveness is highly reliant on the appropriate selection of control variables. 12 A few instances of metaheuristic methods include Differential Evolutionary (DE) algorithms and their variants, 22,23 Genetic Algorithm (GA) and its variants, 24,25 Particle Swarm Optimizer (PSO) and its variants, 26,27 Artificial Bee Colony (ABC), 28 Ant Colony Optimization (ACO), 29 Water Cycle Algorithm (WCA), 30 Cuckoo Search Algorithm (CSA) and its variants, 31,32 Grey Wolf Optimizer (GWO), 33 Whale Optimizer (WO), 34 Firefly Optimizer (FFO), 35 Flower Pollination Algorithm (FPA), 36 Wind Driven Optimization (WDO), 37 Crow Search Algorithm (CrSA), 38 Jaya algorithm and its variants, 39,40 Shuffled Frog Leaping Algorithm (SFLA), 41 Symbiotic Organisms Search (SOS), 42 Salp Swarm Algorithm (SSA), 43–45 Emperor Penguin Algorithm (EPA), 46 Spotted Hyena Algorithm (SHA), 47 Ant Lion Optimizer (ALO), 48 Marine Predator Algorithm (MPA), 49,50 Equilibrium Optimizer (EO), 51,52 Teaching‐Learning‐Based Optimization (TLBO) algorithm, 53 Fireworks Algorithm (FA), 54 Slime Mould Optimization (SMA), 55,56 Runge–Kutta Optimizer (RKO), 57 Hunger Games Search Optimization Algorithm (HGSO), 14,58 Gradient‐Based Optimizer (GBO), 59–61 Tuna Swarm Optimizer (TSO), 62 Atom Search Optimizer (ASO), 63 Arithmetic Optimization Algorithm (AOA), 64 Jumping Spider Algorithm (JSA), 65 Plasma Generation Optimization (PGO), 66 Generalized Normal Distribution Optimization (GNDO) algorithm, 67 African Vulture Algorithm (AVA), 68 Thermal Exchange Optimization (TEO), 69 Turbulent Water Flow Optimization Algorithm (TWFOA), 70 etc., and improvement techniques, such as Nelder–Mead simplex methods, 71 Levy flight mechanism, 72 Brownian random w...…”
Section: Introduction
mentioning; confidence: 99%