2015
DOI: 10.1016/j.procs.2015.07.387
Feature Selection Using K-Means Genetic Algorithm for Multi-objective Optimization

Cited by 39 publications (19 citation statements); references 7 publications.
“…It is noted that the algorithm could produce the minimum index value for the maximum datasets. However, proper feature selection is needed to reach a better, more optimal solution [14,15]. Ruby et al. [16] suggested two methods for ranking MOPs.…”
Section: Literature Review (mentioning)
confidence: 99%
“…Recently developed nature-inspired optimization algorithms are among the best approaches for finding global solutions to combinatorial optimization problems such as microarray datasets. Different genetic algorithm (GA) methods have been proposed to tackle feature selection [5,6]. A hybrid algorithm was proposed using a GA with artificial neural networks (ANNs), where the GA was used as a pre-step to reduce the feature size [7].…”
Section: Introduction (mentioning)
confidence: 99%
“…A disadvantage of K-Means is that it easily falls into local optima. As a remedy, a popular trend is to integrate the genetic algorithm [7,8] with K-Means to obtain genetic K-Means algorithms [9][10][11][12][13][14][15][16][17][18][19][20][21][22][23]. K-Means has also been combined with a fuzzy mechanism to obtain fuzzy C-Means [24,25].…”
Section: Introduction (mentioning)
confidence: 99%
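None of the cited genetic K-Means variants is specified in these excerpts, so the following is only a minimal illustrative sketch of the general idea they describe: maintain a population of candidate centroid sets, refine each with a K-Means (Lloyd) step, and use GA-style selection and mutation to escape the local optima that plain K-Means can get stuck in. All function names and parameter values here are assumptions for illustration, not any cited author's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def sse(X, centers):
    # within-cluster sum of squared errors (lower is better)
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d.min(axis=1).sum()

def lloyd_step(X, centers):
    # one K-Means refinement step: assign points, then recompute centroids
    labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
    new = centers.copy()
    for k in range(len(centers)):
        pts = X[labels == k]
        if len(pts):
            new[k] = pts.mean(axis=0)
    return new

def genetic_kmeans(X, k=3, pop=10, gens=30, mut=0.1):
    # each individual is one set of k centroids sampled from the data
    n, _ = X.shape
    population = [X[rng.choice(n, k, replace=False)] for _ in range(pop)]
    for _ in range(gens):
        # local refinement (K-Means operator), then fitness evaluation
        population = [lloyd_step(X, ind) for ind in population]
        fitness = np.array([sse(X, ind) for ind in population])
        # elitist selection: keep the better half of the population
        elite = [population[i] for i in np.argsort(fitness)[: pop // 2]]
        # Gaussian mutation produces the other half
        children = [p + rng.normal(0.0, mut, p.shape) for p in elite]
        population = elite + children
    fitness = np.array([sse(X, ind) for ind in population])
    best = population[int(np.argmin(fitness))]
    return best, sse(X, best)

# usage: three well-separated Gaussian blobs in 2-D
X = np.vstack([rng.normal(c, 0.3, (50, 2)) for c in ((0, 0), (5, 5), (0, 5))])
centers, score = genetic_kmeans(X, k=3)
```

The mutation step is what distinguishes this from repeated K-Means restarts: a slightly perturbed elite individual can hop out of a basin of attraction that plain Lloyd iteration would never leave.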