2020
DOI: 10.3390/e22080876

Multi-Population Genetic Algorithm for Multilabel Feature Selection Based on Label Complementary Communication

Abstract: Multilabel feature selection is an effective preprocessing step for improving multilabel classification accuracy, because it highlights discriminative features for multiple labels. Recently, multi-population genetic algorithms have gained significant attention in feature selection studies, owing to their enhanced search capability, based on communication among multiple populations, compared with that of traditional genetic algorithms. However, conventional methods employ a simple …

Cited by 19 publications (25 citation statements); References 41 publications.
“…A population is divided into multiple small sub-populations that evolve with their own evolution operations; every once in a while, the sub-populations interact with each other via merging and communication processes to maintain population diversity and avoid premature convergence [46,47]. The general flow of the multi-population optimization method is shown in Figure 4.…”
Section: The Basics of the Multi-Population Optimization Methods
confidence: 99%
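The multi-population scheme quoted above — independent sub-populations that evolve separately and periodically communicate — can be sketched as follows. This is a minimal illustrative sketch, not the paper's method: the OneMax-style fitness (a stand-in for feature-subset quality), the population sizes, the migration interval, and the ring topology are all assumptions chosen for brevity.

```python
import random

random.seed(0)

N_FEATURES = 20          # length of each binary feature mask
N_SUBPOPS = 4            # number of sub-populations
SUBPOP_SIZE = 10
GENERATIONS = 30
MIGRATION_INTERVAL = 5   # sub-populations communicate every few generations

def fitness(mask):
    # Toy surrogate for subset quality: count of selected features (OneMax).
    return sum(mask)

def evolve(pop):
    """One generation: tournament selection, one-point crossover, bit-flip mutation."""
    def tournament():
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b
    new_pop = []
    for _ in range(len(pop)):
        p1, p2 = tournament(), tournament()
        cut = random.randrange(1, N_FEATURES)
        child = p1[:cut] + p2[cut:]
        # Flip each bit with small probability (bool XOR int yields int).
        child = [bit ^ (random.random() < 0.05) for bit in child]
        new_pop.append(child)
    return new_pop

def migrate(subpops):
    """Ring communication: each sub-population's best replaces a random
    individual in the next sub-population, preserving diversity elsewhere."""
    bests = [max(sp, key=fitness) for sp in subpops]
    for i, best in enumerate(bests):
        target = subpops[(i + 1) % len(subpops)]
        target[random.randrange(len(target))] = best[:]

# Initialize independent sub-populations of random binary masks.
subpops = [[[random.randint(0, 1) for _ in range(N_FEATURES)]
            for _ in range(SUBPOP_SIZE)] for _ in range(N_SUBPOPS)]

for gen in range(1, GENERATIONS + 1):
    subpops = [evolve(sp) for sp in subpops]       # isolated evolution
    if gen % MIGRATION_INTERVAL == 0:
        migrate(subpops)                           # periodic communication

best = max((ind for sp in subpops for ind in sp), key=fitness)
print("best fitness:", fitness(best))
```

Because migration happens only every few generations, sub-populations explore different regions between communication steps, which is the diversity-preservation effect the excerpt describes.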
“…We evaluate our proposed feature selection algorithm RQBSO and compare it with three major groups of feature selection methods: two filter methods, namely Max-Relevance and Min-Redundancy (mRMR) [58] and ReliefF [59]; three wrapper methods, namely GA [39], BSO [44], and recursive feature elimination (RFE) [60]; and two leading embedded methods, namely LASSO [30] and Ridge regression [61].…”
Section: Comparison with Other Feature Selection Algorithms
confidence: 99%
“…To address the above issues, various works have been proposed to solve feature selection problems using metaheuristics [35]. Most of them use genetic algorithms (GA) [36][37][38][39]. Metaheuristic algorithms based on swarm intelligence are also applied to feature selection, such as ant colony optimization (ACO) [40,41], particle swarm optimization (PSO) [42,43], and bee swarm optimization (BSO) [44,45].…”
Section: Introduction
confidence: 99%
“…Optimization algorithms have demonstrated their efficiency in improving classification accuracy and reducing the number of selected features. Samples of these recent implementations are PSO [22], BOA [23], SSA [8], ALO [24], WOA [21,25], GOA [26], and GA [27]. Despite the unique construction of each optimization algorithm, there are some shared characteristics: initializing a random population (solutions) as the opening process, evaluating the solutions on each iteration based on the fitness function, updating the solutions, and determining the best solution based on a termination criterion.…”
Section: Related Work
confidence: 99%
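The shared loop the excerpt identifies — random initialization, per-iteration evaluation and update, and a termination criterion — can be illustrated with a minimal sketch. The quadratic objective, Gaussian perturbation update, and fixed iteration budget are illustrative assumptions, not any specific algorithm from the cited works.

```python
import random

random.seed(1)

def fitness(x):
    # Toy objective with a single optimum at x = 3.0.
    return -(x - 3.0) ** 2

def local_step(x):
    """Propose a Gaussian perturbation and accept it only if fitness improves."""
    cand = x + random.gauss(0.0, 0.5)
    return cand if fitness(cand) > fitness(x) else x

# 1. Initialize a random population of candidate solutions.
population = [random.uniform(-10.0, 10.0) for _ in range(20)]

# 2.-3. Each iteration, evaluate and update every solution.
for _ in range(100):                     # 4. Termination: fixed iteration budget.
    population = [local_step(x) for x in population]

best = max(population, key=fitness)      # Report the best solution found.
print("best x:", round(best, 2))
```

Swapping in a different update rule (e.g. velocity updates for PSO or crossover for GA) changes only the middle step; the initialize–evaluate–update–terminate skeleton is common to all the algorithms listed above.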