2022
DOI: 10.1007/s00521-022-07705-4

Multiclass feature selection with metaheuristic optimization algorithms: a review

Abstract: Selecting relevant feature subsets is vital in machine learning, and multiclass feature selection is harder to perform since most classifications are binary. The feature selection problem aims at reducing the dimension of the feature set while maintaining the model's accuracy. Datasets can be classified using various methods. Nevertheless, metaheuristic algorithms attract substantial attention for solving different optimization problems. For this reason, this paper presents a systematic survey of literatu…



Cited by 82 publications (39 citation statements)
References 285 publications
“…All tasks were binary classification problems, preceded by feature selection. While there are several feature selection methods that can be used, including metaheuristic algorithms [44], since the primary goal of the study is to showcase the utility of SHAP, a simplified machine learning workflow was used (Fig 2).…”
Section: Results
confidence: 99%
“…Many proposed metaheuristic algorithms have provided optimal or near-optimal solutions to many real-world applications, including various feature selection problems [24]. These include the Whale Optimization Algorithm (WOA) and its hybrids [16, 25, 26], the Cuckoo Search Optimization Algorithm (CSO) [27–29], the Dragonfly Algorithm (DA) (Chantar et al., 2021; Cui et al., 2020; Sree Ranjini & Murugan, 2017) [30–32], the Prairie Dog Optimization (PDO) Algorithm [33], and many more.…”
Section: Related Literature
confidence: 99%
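The wrapper-style search these metaheuristics perform over binary feature masks can be sketched with a simpler local-search stand-in, greedy bit-flip hill climbing. The toy dataset, nearest-centroid fitness, and size-penalty weight below are illustrative assumptions, not taken from the cited works; algorithms such as WOA, CSO, or DA explore the same mask space with more sophisticated update rules.

```python
# Toy data: features 0 and 1 separate the two classes; features 2-4 are noise.
X = [
    [-2, -2,  0.1, -0.2,  0.3],
    [-2, -2, -0.3,  0.2, -0.1],
    [-2, -2,  0.2,  0.1, -0.3],
    [-2, -2, -0.1, -0.3,  0.2],
    [ 2,  2,  0.3,  0.1, -0.2],
    [ 2,  2, -0.2, -0.1,  0.3],
    [ 2,  2,  0.1,  0.3, -0.1],
    [ 2,  2, -0.3,  0.2,  0.1],
]
y = [0, 0, 0, 0, 1, 1, 1, 1]
D = 5  # number of features

def fitness(mask):
    """Nearest-centroid accuracy on the selected features, minus a size penalty."""
    feats = [j for j in range(D) if mask[j]]
    if not feats:
        return 0.0
    cents = {}
    for c in (0, 1):
        rows = [X[i] for i in range(len(X)) if y[i] == c]
        cents[c] = [sum(r[j] for r in rows) / len(rows) for j in feats]
    correct = 0
    for i, row in enumerate(X):
        dist = {c: sum((row[j] - cents[c][k]) ** 2 for k, j in enumerate(feats))
                for c in (0, 1)}
        correct += (min(dist, key=dist.get) == y[i])
    return correct / len(X) - 0.02 * len(feats)  # penalize larger subsets

# Greedy bit-flip hill climbing, starting from the full feature set: accept
# any single-feature flip that strictly improves the penalized fitness.
mask = [1] * D
best_fit = fitness(mask)
improved = True
while improved:
    improved = False
    for j in range(D):
        cand = mask[:]
        cand[j] = 1 - cand[j]
        f = fitness(cand)
        if f > best_fit:
            mask, best_fit, improved = cand, f, True

selected = [j for j in range(D) if mask[j]]
```

Because removing a redundant or noisy feature improves the penalized fitness while accuracy stays intact, the climber prunes down to a single discriminative feature here; population-based metaheuristics trade this greediness for broader exploration of the mask space.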
“…Quantum feature selection. Feature selection is also a dimensionality reduction algorithm [34]. Its core is to select the feature subset that best represents the overall data through a loss function with a sparse regularization term.…”
Section: Quantum Dimensionality Reduction
confidence: 99%
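The loss-plus-sparse-regularizer formulation mentioned above can be sketched as L1-penalized least squares (the lasso), solved here with proximal gradient descent (ISTA): a gradient step on the squared-error loss followed by soft-thresholding, which drives irrelevant weights exactly to zero. The toy data, step size, and penalty weight are illustrative assumptions, not from the cited work.

```python
def soft_threshold(v, t):
    """Proximal operator of the L1 norm: shrink v toward zero by t."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def lasso_ista(X, y, lam=0.1, step=0.02, iters=2000):
    """Minimize 0.5*||Xw - y||^2 + lam*||w||_1 by proximal gradient descent."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(iters):
        # residual r = Xw - y
        r = [sum(X[i][j] * w[j] for j in range(d)) - y[i] for i in range(n)]
        # gradient of the squared-error loss: X^T r
        grad = [sum(X[i][j] * r[i] for i in range(n)) for j in range(d)]
        # gradient step, then the L1 proximal (soft-thresholding) step
        w = [soft_threshold(w[j] - step * grad[j], step * lam) for j in range(d)]
    return w

# Toy data: y depends only on feature 0 (y = 2 * x0); features 1-2 are noise.
X = [[1, 0.1, -0.2], [2, -0.1, 0.1], [3, 0.2, 0.05],
     [4, -0.2, -0.1], [0.5, 0.15, 0.2], [1.5, -0.05, -0.15]]
y = [2, 4, 6, 8, 1, 3]

w = lasso_ista(X, y)
selected = [j for j in range(3) if abs(w[j]) > 0.01]
```

The sparse penalty zeroes the noise weights, so the surviving nonzero coefficients directly identify the selected feature subset; this is the sense in which a loss function with a sparse regularization term performs feature selection.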