2020
DOI: 10.1109/access.2020.3007291
Approaches to Multi-Objective Feature Selection: A Systematic Literature Review

Abstract: Feature selection has gained much consideration from scholars working in the domain of machine learning and data mining in recent years. Feature selection is a popular problem in machine learning, with the goal of finding optimal features that increase accuracy. As a result, several studies have been conducted on multi-objective feature selection through numerous multi-objective techniques and algorithms. The objective of this paper is to present a systematic literature review of the challenges and issues of th…

Cited by 104 publications (41 citation statements)
References: 90 publications
“…Capturing the essential but hidden variables for significant insight is a demanding task, but it is the best choice for sound decisions and predictions. Most data scientists [30], [31] are trying to find the best methods to achieve the best feature insights. In this research, the WA is used to select the feature sets (named premier feature sets) across the given dataset.…”
Section: Proposed Methods – GAWA
confidence: 99%
“…Feature selection is considered a vital part of many machine learning methods and remarkably affects model accuracy. Generally, it has two conflicting aims: maximizing classification performance and minimizing the feature-set size [30], [31]. Several feature selection techniques have been introduced, such as Mutual Information (MI), Term Frequency–Inverse Document Frequency (TF-IDF), Information Gain (IG), and Chi-Square (CS), for sentiment classification with different machine learning algorithms.…”
Section: Related Work
confidence: 99%
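To make the Chi-Square (CS) criterion mentioned in the statement above concrete, here is a minimal sketch — not taken from any of the reviewed papers — that scores binary features against a binary class via the standard 2×2 contingency-table chi-square statistic and keeps the top-k feature indices:

```python
def chi_square_score(feature, labels):
    # Counts for the 2x2 contingency table of a binary feature vs. a binary class.
    a = sum(1 for f, y in zip(feature, labels) if f and y)          # f=1, y=1
    b = sum(1 for f, y in zip(feature, labels) if f and not y)      # f=1, y=0
    c = sum(1 for f, y in zip(feature, labels) if not f and y)      # f=0, y=1
    d = sum(1 for f, y in zip(feature, labels) if not f and not y)  # f=0, y=0
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    # Shortcut chi-square formula for a 2x2 table: N(ad - bc)^2 / row-col products.
    return 0.0 if denom == 0 else n * (a * d - b * c) ** 2 / denom

def select_top_k(X, y, k):
    # Rank feature columns by chi-square score; return the k best column indices.
    scores = [chi_square_score([row[j] for row in X], y) for j in range(len(X[0]))]
    return sorted(sorted(range(len(scores)), key=lambda j: -scores[j])[:k])
```

A feature perfectly aligned with the class label gets the maximum score, while a feature independent of the label scores zero, so the ranking directly reflects the filter criterion described above.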
“…Carefully understanding the dataset, along with its dimensionality reduction issues, before any data analysis process is crucial to the success of the analysis itself [32]. Data preprocessing involves transforming raw data into a format that is suitable for processing.…”
Section: Feature Selection Algorithm
confidence: 99%
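As a small illustration of the preprocessing step described above — a sketch of one common transformation, not the specific pipeline used in the cited work — the following standardizes a raw numeric column to zero mean and unit variance:

```python
def standardize(column):
    # z-score scaling: subtract the mean and divide by the (population) std dev,
    # a typical step before distance- or gradient-based learning algorithms.
    n = len(column)
    mean = sum(column) / n
    var = sum((x - mean) ** 2 for x in column) / n
    std = var ** 0.5
    # Constant columns carry no information; map them to zeros rather than divide by 0.
    return [(x - mean) / std if std else 0.0 for x in column]
```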
“…This can be done using a heuristic strategy, which performs a guided search over the entire solution space to find a reasonably good feature subset that may not be the optimal solution but is acceptable within computational constraints. Higher-level heuristics, or meta-heuristics, have become quite popular in recent years for solving FS problems in different fields [8], such as handwriting recognition [9], benchmark problems [10], gene selection [11], medical diagnosis [12], financial problems, and network intrusion and security.…”
Section: A. Feature Selection
confidence: 99%
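The guided search described above can be sketched with the simplest such heuristic, a hill-climbing search over feature-subset bitmasks. This is an illustrative stand-in, not one of the meta-heuristics surveyed in the paper; `evaluate` is a hypothetical user-supplied subset-quality function (e.g. cross-validated accuracy minus a size penalty):

```python
import random

def hill_climb_features(n_features, evaluate, iterations=200, seed=0):
    # Greedy local search: flip one feature bit at a time and keep the flip
    # only when it improves the subset's evaluation score.
    rng = random.Random(seed)
    current = [rng.random() < 0.5 for _ in range(n_features)]
    best_score = evaluate(current)
    for _ in range(iterations):
        j = rng.randrange(n_features)
        current[j] = not current[j]       # propose: toggle one feature
        score = evaluate(current)
        if score > best_score:
            best_score = score            # accept improving move
        else:
            current[j] = not current[j]   # revert non-improving move
    return current, best_score
```

Like the meta-heuristics the statement mentions, this trades guaranteed optimality for a good-enough subset found within a bounded number of evaluations.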
“…offspring1 = r_of * male + (1 − r_of) * female (8)
offspring2 = r_of * female + (1 − r_of) * male (9)…”
Section: Some Preliminaries – A. Mayfly Optimization Algorithm
confidence: 99%
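Equations (8) and (9) above are the Mayfly algorithm's crossover: each offspring is a convex combination of the male and female position vectors with a single random weight r_of in [0, 1]. A minimal sketch under that reading (vector shapes and the RNG handling are illustrative assumptions):

```python
import random

def mayfly_crossover(male, female, rng=None):
    # Eq. (8)-(9): blend the two parent position vectors with one
    # shared random weight r_of drawn uniformly from [0, 1].
    rng = rng or random.Random()
    r = rng.random()
    offspring1 = [r * m + (1 - r) * f for m, f in zip(male, female)]
    offspring2 = [r * f + (1 - r) * m for m, f in zip(male, female)]
    return offspring1, offspring2
```

Because the two weights sum to 1, each offspring coordinate lies between the corresponding parent coordinates, and the pair of offspring conserves the parents' coordinate-wise sum.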