2021
DOI: 10.1007/s00521-021-06406-8
A systematic review of emerging feature selection optimization methods for optimal text classification: the present state and prospective opportunities

Abstract: Specialized data preparation techniques, such as data cleaning, outlier detection, missing value imputation, and feature selection (FS), are procedures required to get the most out of data and, consequently, to obtain optimal performance from predictive models on classification tasks. FS is a vital and indispensable technique that enables a model to run faster, eliminate noisy data, remove redundancy, reduce overfitting, improve precision, and increase generalization on testing data. While c…
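To make the abstract's notion of FS for text classification concrete, here is a minimal sketch (not from the reviewed paper) of a filter-based approach using scikit-learn: terms are scored with the chi-squared test and all but the k most class-informative terms are discarded before classification. The corpus, labels, and k value are hypothetical, chosen only for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny synthetic corpus (illustrative data only).
docs = [
    "the team won the football match",
    "a great goal in the soccer game",
    "stocks rose as the market rallied",
    "investors sold shares amid market fears",
]
labels = [0, 0, 1, 1]  # 0 = sports, 1 = finance

# Pipeline: vectorize -> keep the k best terms by chi-squared score -> classify.
pipeline = make_pipeline(
    TfidfVectorizer(),
    SelectKBest(chi2, k=5),   # discard all but the 5 most class-informative terms
    LogisticRegression(),
)
pipeline.fit(docs, labels)
print(pipeline.predict(["the market and shares fell"]))
```

Filter methods like this score each feature independently and cheaply; the metaheuristic methods surveyed by the paper instead search over feature subsets, trading computation for the ability to capture feature interactions.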

Cited by 61 publications (21 citation statements)
References 144 publications (134 reference statements)
“…Similarly, Rostami et al. [203] introduced a comparative analysis of other swarm intelligence-based feature selection methods, considering the strengths and weaknesses of these methods. Moreover, Abiodun et al. [5] conducted an organized review of evolving feature selection methods for text classification optimization tasks. The scope of their work was from 2015 to 2021, reviewing over 200 articles concerning metaheuristic and hyperheuristic procedures.…”
Section: Introduction (mentioning)
confidence: 99%
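The metaheuristic feature selection procedures this excerpt refers to can be sketched in miniature. Below is a hypothetical wrapper-style search (not any specific algorithm from the reviewed papers): a random-mutation hill climber over binary feature masks. The `fitness` function is a toy stand-in for what would normally be a classifier's cross-validated accuracy minus a penalty on subset size.

```python
import random

def fitness(mask):
    # Toy objective (assumption, for illustration): even-indexed features are
    # "relevant" (+1 each); every selected feature costs 0.1, discouraging
    # large subsets. A real wrapper would evaluate a classifier here.
    relevant = sum(1 for i, bit in enumerate(mask) if bit and i % 2 == 0)
    return relevant - 0.1 * sum(mask)

def hill_climb(n_features, iters=200, seed=0):
    rng = random.Random(seed)
    mask = [rng.randint(0, 1) for _ in range(n_features)]
    best = fitness(mask)
    for _ in range(iters):
        candidate = mask[:]
        candidate[rng.randrange(n_features)] ^= 1  # flip one bit of the mask
        score = fitness(candidate)
        if score > best:  # greedy acceptance: keep strictly better masks
            mask, best = candidate, score
    return mask, best

mask, score = hill_climb(10)
print(mask, score)
```

Swarm- and evolution-based methods (PSO, genetic algorithms, and the hyperheuristics the review covers) replace this single-solution greedy loop with a population of masks and stochastic acceptance, which helps escape local optima at higher evaluation cost.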
“…Andrade et al., 2020; De Mello et al., 2019; Wang et al., 2010; Gourinchas et al., 2020; Anderson et al., 2020; Kalemli-Ozcan et al., 2020; Ibn-Mohammed et al., 2021; Abiodun et al., 2021a; Yadav, 2021; Ahmed and Tushar, 2020; Omolara et al., 2018a; Omolara et al., 2022; Netherlands, 2020; Omolara et al., 2019a; Esther Omolara et al., 2020; Oludare et al., 2018a; Oludare et al., 2018b; UNESCO, 2020; Midkiff and DaSilva, 2000; Google, 2020; Prates et al., 2020; Suhono et al., 2020; Omolara et al., 2019b; BBC News (April 13, 2020); Abiodun et al., 2021b; Omolara et al., 2018b; Omolara et al., 2019c; Wilson, 2014; Masciandaro (Ed.), 2017.…”
Section: Uncited References (mentioning)
confidence: 99%
“…Subsequently, the VGG16 network [35] explored the relationship between the depth and performance of convolutional neural networks and has strong extensibility; the Inception network [36] differs from the previous two networks in that it adds a structure called Inception, whose main advantage is a reduction in training parameters; the ResNet network [37] introduced the residual network structure, which keeps accuracy from degrading as the network deepens. These classic neural networks and other network structure models continue to emerge, leading to applications of neural networks in many fields, such as natural language processing [38] and information decryption [39,40], which bring great convenience to human life.…”
Section: PLOS ONE (mentioning)
confidence: 99%