2017
DOI: 10.5614/itbj.ict.res.appl.2017.11.3.6
Improving Floating Search Feature Selection using Genetic Algorithm

Abstract: Classification, a process for predicting the class of given input data, is one of the most fundamental tasks in data mining. Classification performance is negatively affected by noisy data; therefore, selecting features relevant to the problem is a critical step in classification, especially when applied to large datasets. In this article, a novel filter-based floating search technique for feature selection, to select an optimal set of features for classification purposes, is proposed. A genetic al…

Cited by 6 publications (7 citation statements)
References 24 publications (23 reference statements)
“…The accuracy of imputation with the hybrid method increased by 1%, showing better performance than the other three imputation models and than no imputation. Using GA classifiers such as NB, k-NN and NN on the dataset repository produced significant differences in classification accuracy [22]. The hybrid method showed better classification accuracy with large-dimension datasets, with more than 90% classification accuracy at missing rates below 30%.…”
Section: Results
confidence: 99%
“…However, NB is very sensitive to the selection of features, so feature weighting and independent-variable selection are required to improve model accuracy [21]. The genetic algorithm is an iterative method for finding a global optimum in the selection of the features to be used as input to the naïve Bayes process [22][23][24][25][26][27][28]. This research used SOMI combined with feature selection by an evolutionary algorithm and NBC.…”
Section: Introduction
confidence: 99%
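As a rough illustration of the idea in the quoted passage, the sketch below evolves binary feature masks with a genetic algorithm. All names here are illustrative, and the toy fitness function is a hypothetical stand-in for the classification accuracy of a naïve Bayes model on the selected features; it is not the method of the cited papers.

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=40,
                      crossover_rate=0.9, mutation_rate=0.05, seed=0):
    """Minimal genetic algorithm over binary feature masks.

    fitness: callable mapping a tuple of 0/1 flags to a score (higher is
    better). Returns the best mask found across all generations.
    """
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in range(n_features))
           for _ in range(pop_size)]
    best = max(pop, key=fitness)  # elitism: remember the best mask seen
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random individuals.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate:
                point = rng.randrange(1, n_features)  # single-point crossover
                child = p1[:point] + p2[point:]
            else:
                child = p1
            # Bit-flip mutation on each gene.
            child = tuple(1 - g if rng.random() < mutation_rate else g
                          for g in child)
            children.append(child)
        pop = children
        best = max(pop + [best], key=fitness)
    return best

# Toy fitness standing in for naïve Bayes accuracy: features 0-2 are
# "relevant"; each irrelevant feature incurs a small noise penalty.
def toy_fitness(mask):
    return sum(mask[:3]) - 0.1 * sum(mask[3:])

best_mask = ga_feature_select(8, toy_fitness)
```

In a real wrapper setup, `toy_fitness` would be replaced by cross-validated classifier accuracy on the masked feature columns.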
“…Instead, we propose a simpler approach, namely, floating search, which has obtained successful results in other research fields, such as feature selection [13][14][15][16].…”
Section: Methods
confidence: 99%
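Floating search as referenced here can be sketched as sequential forward floating selection (SFFS): greedy forward additions interleaved with conditional backward removals that backtrack whenever dropping a feature beats the best subset previously seen at that size. A minimal version, with a hypothetical subset-scoring function in place of classifier accuracy:

```python
def sffs(n_features, score, k):
    """Sequential forward floating selection (sketch).

    score: callable on a frozenset of feature indices (higher is better).
    Returns the best subset of size k found.
    """
    selected = set()
    best_by_size = {0: (frozenset(), score(frozenset()))}
    while len(selected) < k:
        # Forward step: add the single feature that improves the score most.
        f = max(set(range(n_features)) - selected,
                key=lambda i: score(frozenset(selected | {i})))
        selected.add(f)
        s = score(frozenset(selected))
        if len(selected) not in best_by_size or s > best_by_size[len(selected)][1]:
            best_by_size[len(selected)] = (frozenset(selected), s)
        # Conditional backward steps: drop a feature only if the reduced
        # subset beats the best subset already recorded at that smaller size.
        while len(selected) > 2:
            g = max(selected, key=lambda i: score(frozenset(selected - {i})))
            s_back = score(frozenset(selected - {g}))
            if s_back > best_by_size[len(selected) - 1][1]:
                selected.remove(g)
                best_by_size[len(selected)] = (frozenset(selected), s_back)
            else:
                break
    return best_by_size[k][0]

# Toy additive score: each feature contributes a fixed weight.
weights = [3, 1, 4, 1, 5, 0.5]
subset = sffs(6, lambda s: sum(weights[i] for i in s), k=3)
# → frozenset({0, 2, 4})
```

The backtracking step is what distinguishes floating search from plain greedy forward selection; the GA-based variant proposed in the cited article builds on this scheme rather than replacing it.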
“…A recent study from Homsapaya & Sornil in [6] introduced a floating search technique employing a genetic algorithm (GA) to improve the quality of the selected feature subset. The results showed that GA improved the performance for the majority of sample datasets.…”
Section: Feature Selection
confidence: 99%