2013
DOI: 10.1016/j.eswa.2013.03.032
Feature selection for face recognition based on multi-objective evolutionary wrappers

Cited by 83 publications (36 citation statements) | References 38 publications
“…They need to train a predictor (classifier or regression model) to evaluate each feature subset; therefore, they are more precise but slower than the filters. The main challenges in wrapper methods are selecting a proper predictor (Li et al. 2009; Maldonado and Weber 2009; Monirul Kabir, Monirul Islam, and Murase 2010; Sánchez-Maroño and Alonso-Betanzos 2011) and how to generate appropriate subsets (Macas et al. 2012; Tay and Cao 2001; Vignolo, Milone, and Scharcanski 2013). Finally, hybrid methods use both filter and wrapper evaluators simultaneously (Bermejo et al. 2012; Bermejo, Gámez, and Puerta 2011; Gheyas and Smith 2010; Ruiz et al. 2012).…”
Section: Feature Selection
confidence: 99%
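
As a concrete illustration of the wrapper idea in the excerpt above, here is a minimal sketch in Python. The predictor (a k-NN classifier), the scikit-learn wine dataset, and the greedy forward strategy for generating subsets are all illustrative assumptions, not choices made in the cited works; the point is only that the predictor is retrained and scored for every candidate subset, which is why wrappers are more precise but slower than filters.

from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)
n_features = X.shape[1]

def subset_score(features):
    # Wrapper evaluation: cross-validated accuracy of a predictor trained
    # only on the candidate feature subset.
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, features], y, cv=5).mean()

selected, remaining, best_score = [], list(range(n_features)), 0.0

# Greedy forward generation of subsets: add the single feature that most
# improves the wrapper score, and stop when nothing improves it.
while remaining:
    score, feature = max((subset_score(selected + [f]), f) for f in remaining)
    if score <= best_score:
        break
    selected.append(feature)
    remaining.remove(feature)
    best_score = score

print("selected features:", selected, "CV accuracy:", round(best_score, 3))
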
“…The goal of feature selection methods is to seek the relevant features with the most predictive information from the original feature set. Feature selection has been established as an important technique in many practical applications such as text processing (Aghdam et al., 2009; Shamsinejadbabki and Saraee, 2011; Uğuz, 2011), face recognition (Chakraborti and Chatterjee, 2014; Kanan and Faez, 2008; Vignolo et al., 2013), image retrieval (da Silva et al., 2011; Rashedi et al., 2013), medical diagnosis (Inbarani et al., 2014), case-based reasoning (Zhu et al., 2015), collaborative filtering based recommender systems (Ramezani et al., 2013) and bioinformatics (Jaganathan and Kuppuchamy, 2013).…”
Section: Introduction
confidence: 99%
“…The experiments were performed using the University of Essex face recognition data. In another work, Vignolo et al. (2013) also used ASM for extracting features, but with a modified version of GA called Multi-Objective Genetic Algorithm (MOGA), in which the rank of an individual is the number of chromosomes in the population by which it is dominated. Also, the authors proposed an aggregative fitness function, which combines classification accuracy and the number of features in a single equation.…”
Section: Genetic Algorithm
confidence: 99%
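
The excerpt above attributes two mechanisms to Vignolo et al. (2013): a domination-based rank for the MOGA population and an aggregative fitness that combines classification accuracy with the number of selected features. The following is a rough, non-authoritative sketch of both ideas; the Fonseca-and-Fleming-style rank convention, the trade-off weight alpha, and the linear size penalty are assumptions for illustration, not the authors' exact formulation.

def dominates(a, b):
    # a = (accuracy, n_features); a dominates b if it is no worse on both
    # objectives (higher accuracy, fewer features) and strictly better on one.
    acc_a, n_a = a
    acc_b, n_b = b
    return acc_a >= acc_b and n_a <= n_b and (acc_a > acc_b or n_a < n_b)

def moga_ranks(objectives):
    # Assumed rank convention: 1 + number of individuals that dominate this
    # one (lower rank is better).
    return [1 + sum(dominates(other, me) for j, other in enumerate(objectives) if j != i)
            for i, me in enumerate(objectives)]

def aggregative_fitness(accuracy, n_selected, n_total, alpha=0.8):
    # Single-equation fitness: weighted accuracy minus a penalty on the
    # fraction of features kept; alpha is an assumed trade-off weight.
    return alpha * accuracy - (1.0 - alpha) * (n_selected / n_total)

# Toy population of (accuracy, number-of-selected-features) pairs.
population = [(0.92, 30), (0.90, 12), (0.85, 8), (0.92, 45)]
print(moga_ranks(population))             # -> [1, 1, 1, 2]
print(aggregative_fitness(0.90, 12, 60))  # -> 0.8*0.90 - 0.2*0.2 = 0.68
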