DOI: 10.1007/978-3-540-71618-1_27

Multi-objective Feature Selection with NSGA II

Cited by 125 publications (66 citation statements)
References 14 publications

“…[50] This method is designed to select the most informative features for image quality from all aesthetic features. The goal is to find an optimal subset of features by minimizing both the number of used features and the quality prediction error.…”
Section: Review on Related Measures (mentioning)
confidence: 99%

“…Then, we utilized the NSGA-II feature selection method [50] to select the most informative features for image quality from the augmented feature vector. This method is designed to find an optimal subset of features by minimizing both the number of used features and the classification error of a 1-NN classifier.…”
Section: Optimal Feature Selection Analysis (mentioning)
confidence: 99%
“…They define two objectives, where the first one maximizes the separation of clusters in the feature space, while the second one minimizes the number of selected features. Similarly, Hamdani et al. use NSGA-II to minimize the classification error of a nearest-neighbor (1-NN) classifier as well as the number of features [9]. The same approach is used by Tekgüç et al. in the context of facial expression recognition [23].…”
Section: Related Work (mentioning)
confidence: 99%

“…Merelo et al. [8] applied a MOEA to account separately for type I (false positive) and type II (false negative) errors. Hamdani et al. [9] used the NSGA-II algorithm [10] to simultaneously optimize the number of features and the global error obtained by a neural network classifier.…”
Section: Introduction (mentioning)
confidence: 99%