2023
DOI: 10.17485/ijst/v16i10.2102

Feature Selection Techniques in Learning Algorithms to Predict Truthful Data

Abstract: Objectives: This review focuses on various feature selection processes, strategies, and methods, such as filter, wrapper, and embedded algorithms, and presents their advantages and disadvantages. Methods: Algorithms such as Mutual Information Gain (MIG), Chi-Square (CS), and Recursive Feature Elimination (RFE) are used to select features. Two benchmark datasets are used in this review: Breast Cancer and Diabetes. Findings: To improve efficiency, the selection of appropriate feature selection methods and algori…
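The following is a minimal sketch of the three feature selection methods named in the abstract, applied to the Breast Cancer benchmark dataset. It assumes scikit-learn; the choice of k = 10 retained features, the Min-Max scaling for the chi-square test, and the logistic regression estimator inside RFE are illustrative assumptions, not details taken from the paper.

# Sketch: MIG, Chi-Square, and RFE feature selection on the Breast Cancer dataset
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, mutual_info_classif, chi2, RFE
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MinMaxScaler

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
k = 10  # number of features to keep; chosen here only for illustration

# Filter method 1: Mutual Information Gain
mig = SelectKBest(mutual_info_classif, k=k).fit(X, y)

# Filter method 2: Chi-Square (requires non-negative inputs, hence the scaling)
X_scaled = MinMaxScaler().fit_transform(X)
cs = SelectKBest(chi2, k=k).fit(X_scaled, y)

# Wrapper method: Recursive Feature Elimination with a linear estimator
rfe = RFE(LogisticRegression(max_iter=5000), n_features_to_select=k).fit(X, y)

print("MIG :", list(X.columns[mig.get_support()]))
print("CS  :", list(X.columns[cs.get_support()]))
print("RFE :", list(X.columns[rfe.get_support()]))

Comparing the three printed subsets (and the downstream classifier accuracy on each) is one straightforward way to reproduce the kind of comparison the review describes.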

Cited by 2 publications (1 citation statement) · References 22 publications
“…In this study, we propose a hybrid dimensionality reduction module that harnesses the power of feature selection and feature extraction approaches to reduce the input feature space, since unhelpful features increase computational complexity and decrease accuracy. Usha and Anuradha [18] compared three feature selection techniques: mutual information gain, the chi-square method, and recursive feature elimination. They reported that recursive feature elimination selects optimal features compared to the other techniques.…”
Section: Introduction (mentioning)
confidence: 99%
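A minimal sketch of the idea behind that cited finding, again assuming scikit-learn: RFECV (a cross-validated variant of RFE, not something prescribed by the paper) lets recursive elimination choose the number of features itself, which is one way to operationalize "selects optimal features". The logistic regression estimator and 5-fold accuracy scoring are illustrative assumptions.

# Sketch: cross-validated RFE picking the feature-subset size automatically
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Drop one feature per step; score each candidate subset by 5-fold accuracy
selector = RFECV(LogisticRegression(max_iter=5000), step=1, cv=5,
                 scoring="accuracy").fit(X, y)

print("Optimal number of features:", selector.n_features_)
print("Selected features:", list(X.columns[selector.support_]))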