2022
DOI: 10.7494/csci.2022.23.2.4204
An Assessment of Nature-Inspired Algorithms for Text Feature Selection

Abstract: This paper provides a comprehensive assessment of feature selection (FS) methods that originate from nature-inspired (NI) meta-heuristics; two well-known filter-based FS methods are also included for comparison. The performances of the considered methods are compared on two high-dimensional, real-world text datasets in terms of accuracy, the number of selected features, and computation time. This study differs from existing studies in the extent of the experimental analyses perform…

Cited by 5 publications (9 citation statements). References 35 publications.
“…One of the most popular ways to achieve dimension reduction is to apply FS, in which different methods can be used to select the most informative features. Considering the evaluation criteria, the methods used in the FS step can be categorized into three types, namely filters, wrappers (e.g., nature‑inspired meta‑heuristics), and embedded.13,45 In this paper, we use the filter‑based approach and employ the most popular methods,13 including chi‑square (CHI or $\chi^2$), mutual information (MI), and the one‑way ANOVA function (AF).…”
Section: Methods
confidence: 99%
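The filter-based chi-square criterion mentioned above can be sketched in a few lines. The citing paper's exact formula is not reproduced on this page, so the function below follows the standard 2x2 contingency-table definition of the $\chi^2$ term-utility score used in information retrieval; the cell names (`n11`, `n10`, `n01`, `n00`) are illustrative, not taken from the paper.

```python
def chi_square(n11: int, n10: int, n01: int, n00: int) -> float:
    """Standard chi-square score for a binary term f and class c.

    n11: docs in class c that contain term f
    n10: docs not in c that contain f
    n01: docs in c that do not contain f
    n00: docs not in c that do not contain f
    """
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n01) * (n11 + n10) * (n10 + n00) * (n01 + n00)
    return num / den if den else 0.0

# A term concentrated in one class scores high; an evenly spread term scores 0.
print(chi_square(40, 10, 10, 40))  # strong association -> 36.0
print(chi_square(25, 25, 25, 25))  # term independent of class -> 0.0
```

In a filter-based pipeline, each candidate feature is scored this way and the top-k features are kept before training the classifier.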
“…From the FS perspective, CHI measures how independent a feature is of a class (i.e., label), while MI takes its maximum value if there is a strong dependence between a feature and a class.13,23,46 These two methods have quite similar definitions, and the goodness of a feature f with respect to a class c is measured as follows:7,13,23,46…”
Section: Supervised ML
confidence: 99%
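The truncated quote above does not show the formula itself, so the sketch below uses the standard mutual-information definition for a binary term/class pair, $I(f;c)=\sum_{x,y} p(x,y)\log_2\frac{p(x,y)}{p(x)p(y)}$; the contingency-cell naming mirrors the chi-square convention and is an assumption, not the paper's notation. It illustrates the behavior the citation describes: MI peaks under strong feature-class dependence and vanishes under independence.

```python
import math

def mutual_information(n11: int, n10: int, n01: int, n00: int) -> float:
    """Standard MI (in bits) between binary term presence and class membership.

    n11: docs in class c containing term f, n10: docs not in c containing f,
    n01: docs in c without f, n00: docs not in c without f.
    """
    n = n11 + n10 + n01 + n00
    mi = 0.0
    # Each tuple: joint count, marginal count of the term value, of the class value.
    for nxy, nx, ny in [
        (n11, n11 + n10, n11 + n01),
        (n10, n11 + n10, n10 + n00),
        (n01, n01 + n00, n11 + n01),
        (n00, n01 + n00, n10 + n00),
    ]:
        if nxy:  # 0 * log(0) is treated as 0
            mi += (nxy / n) * math.log2(nxy * n / (nx * ny))
    return mi

print(mutual_information(50, 0, 0, 50))   # perfect dependence -> 1.0 bit
print(mutual_information(25, 25, 25, 25)) # independence -> 0.0
```

Comparing the two scores on the same table makes the contrast in the quote concrete: both reward features whose presence is skewed toward one class, but MI is bounded by the class entropy while chi-square grows with the sample size.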