2023
DOI: 10.3390/app13020906
Analyzing Physics-Inspired Metaheuristic Algorithms in Feature Selection with K-Nearest-Neighbor

Abstract: In recent years, feature selection has emerged as a major challenge in machine learning. In this paper, considering the promising performance of metaheuristics on different types of applications, six physics-inspired metaheuristic algorithms are employed for this problem. To evaluate the dimensionality-reduction capability of these algorithms, six datasets of diverse natures are used. Performance is compared in terms of the average number of features selected (AFS), accuracy, fitness, convergence capabilities, …
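To make the wrapper setup in the abstract concrete, below is a minimal sketch of a feature-subset fitness function evaluated with k-NN, of the kind such metaheuristics typically minimize. The weighting scheme (`alpha` balancing error rate against subset size) and the tiny hand-rolled classifier are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def knn_accuracy(X_tr, y_tr, X_te, y_te, k=3):
    """Accuracy of a plain k-NN classifier (Euclidean distance, majority vote)."""
    correct = 0
    for x, y in zip(X_te, y_te):
        d = np.linalg.norm(X_tr - x, axis=1)        # distances to all training points
        nn = y_tr[np.argsort(d)[:k]]                # labels of the k nearest neighbors
        pred = np.bincount(nn).argmax()             # majority vote
        correct += pred == y
    return correct / len(y_te)

def fitness(mask, X_tr, y_tr, X_te, y_te, alpha=0.99):
    """Wrapper fitness for a binary feature mask: lower is better.
    alpha trades classification error against the fraction of features kept."""
    if not mask.any():                              # an empty subset is invalid
        return 1.0
    acc = knn_accuracy(X_tr[:, mask], y_tr, X_te[:, mask], y_te)
    return alpha * (1 - acc) + (1 - alpha) * mask.sum() / mask.size
```

A metaheuristic then searches over binary masks, so that (all else equal) a smaller subset achieving the same k-NN accuracy receives a strictly better fitness, which is exactly what the AFS metric rewards.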

Cited by 31 publications (12 citation statements)
References 33 publications
“…The findings determined that kNN can classify with 96.83% accuracy when Ant Colony Optimization is used as the feature selection method. Another study, aiming to classify a breast cancer dataset with kNN using feature selection, was proposed by Priyadarshini et al. in [38]. The authors applied various feature selection methods to several datasets, including a breast cancer dataset, and then classified them with kNN.…”
Section: Literature Review
confidence: 99%
“…The performance of these algorithms was compared using factors such as accuracy, processing cost, suitability, average number of selected features, and convergence capability. The results showed that the Equilibrium Optimizer (EO) performed better than the other algorithms, and it was suggested for solving feature selection problems 29 .…”
Section: Related Literature
confidence: 99%
“…In this paper, we integrate the information feedback model (IFM) [13] into the original Sine-Cosine Algorithm framework [31,32] to tackle many-objective optimization problems. The model's underlying concept is to use information from individuals in prior iterations to influence offspring generation.…”
Section: Related Work
confidence: 99%
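The combination described in that statement can be sketched roughly as follows: a standard Sine-Cosine position update blended with the individual's previous-iteration position. The fixed blend weight `w` here is an illustrative stand-in for the fitness-derived weights of the actual information feedback model:

```python
import numpy as np

rng = np.random.default_rng(0)

def sca_step(x, best, t, T, a=2.0):
    """One standard Sine-Cosine Algorithm position update."""
    r1 = a - t * (a / T)                       # linearly decreasing amplitude
    r2 = rng.uniform(0, 2 * np.pi, x.shape)
    r3 = rng.uniform(0, 2, x.shape)
    r4 = rng.uniform(size=x.shape)
    step = np.abs(r3 * best - x)               # attraction toward the best solution
    move = np.where(r4 < 0.5,
                    r1 * np.sin(r2) * step,    # sine branch
                    r1 * np.cos(r2) * step)    # cosine branch
    return x + move

def sca_step_with_feedback(x, x_prev, best, t, T, w=0.7):
    """SCA update blended with the previous-iteration position.
    A fixed weight w is an assumption; the IFM derives weights from fitness."""
    return w * sca_step(x, best, t, T) + (1 - w) * x_prev
```

Feeding back the previous-iteration position damps the oscillatory sine/cosine moves, which is the intuition behind reusing historical individuals when generating offspring.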