2006
DOI: 10.1016/j.eswa.2005.09.024
A GA-based feature selection and parameters optimization for support vector machines

Cited by 1,242 publications (602 citation statements)
References 11 publications
“…In addition to the commonly used GS, other techniques were also employed in SVR (or SVM) to correct appropriate values of hyper-parameters. Huang and Wang (2006) presented a GA-based feature selection and parameters' optimization for SVM. Also, Momma and Bennett (2002) developed a fully automated pattern search (PS) methodology for model selection of SVR.…”
Section: Development of Model (mentioning)
confidence: 99%
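The GA-based hyper-parameter search described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `fitness()` is a toy surrogate standing in for cross-validated SVM accuracy, and the peak location, parameter ranges, and GA settings are all illustrative assumptions.

```python
import random

# Sketch of a GA over SVM-style hyper-parameters (C, gamma).
# fitness() is a toy surrogate for cross-validated accuracy; a real
# implementation would train and validate an SVM for each individual.

def fitness(c, gamma):
    # Assumed unimodal surrogate peaking at C = 10, gamma = 0.1.
    return 1.0 / (1.0 + (c - 10.0) ** 2 + (gamma - 0.1) ** 2)

def ga_search(pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    # Each individual directly encodes a (C, gamma) pair.
    pop = [(rng.uniform(0.1, 100.0), rng.uniform(0.001, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = ((a[0] + b[0]) / 2.0,         # arithmetic crossover
                     (a[1] + b[1]) / 2.0)
            if rng.random() < 0.3:                # multiplicative mutation
                child = (child[0] * rng.uniform(0.8, 1.2),
                         child[1] * rng.uniform(0.8, 1.2))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(*ind))

best_c, best_gamma = ga_search()
```

Because the fittest half survives each generation, the best fitness never decreases; in practice the population contracts toward the high-fitness region.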
“…Two methods were considered in this study: genetic algorithms (GA) and sequential forward selection (SFS). A GA approach (Huang and Wang, 2006) for feature selection is here used as an alternative to the conventional heuristic method SFS (Theodoridis and Koutroumbas, 2008). In the case of SFS method, the classification accuracy is used as selection criteria, thus, an exhaustive analysis was conducted.…”
Section: Feature Selection (mentioning)
confidence: 99%
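The SFS alternative mentioned above greedily adds whichever feature most improves the selection criterion. A minimal sketch, in which the per-feature utilities and the redundancy penalty are hypothetical stand-ins for the classification accuracy used as the criterion:

```python
# Sketch of sequential forward selection (SFS): greedily add the feature
# that most improves the selection criterion.

def sequential_forward_selection(features, score, k):
    selected, remaining = [], list(features)
    while len(selected) < k and remaining:
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical per-feature utilities; f2 is largely redundant with f1.
utility = {"f1": 0.9, "f2": 0.5, "f3": 0.4, "f4": 0.1}

def score(subset):
    s = sum(utility[f] for f in subset)
    if "f1" in subset and "f2" in subset:
        s -= 0.45  # redundancy penalty
    return s

chosen = sequential_forward_selection(utility, score, k=2)
# f1 is picked first; f3 then beats the redundant f2.
```

Note the greedy structure: each step requires scoring every remaining candidate, which is the exhaustive analysis the quote refers to.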
“…scatter and maximizes the between-cluster separation, such as proposed by Huang and Wang (2006). Both methods are carried out following two approaches: in the first one, features from the four channels are considered as a whole dataset using all-channel analysis (GA and SFS).…”
Section: Feature Selection (mentioning)
confidence: 99%
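One common form of the criterion sketched above (minimize within-cluster scatter, maximize between-cluster separation) is a Fisher-style ratio. The following is an illustrative single-feature version with made-up sample values, not the cited paper's exact formulation:

```python
# Sketch of a separability criterion: ratio of between-class separation
# to within-class scatter for a single feature (a Fisher-style score).

def fisher_score(class_a, class_b):
    def mean(xs):
        return sum(xs) / len(xs)
    def scatter(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    between = (mean(class_a) - mean(class_b)) ** 2
    within = scatter(class_a) + scatter(class_b)
    return between / within

# Well-separated, low-scatter classes yield a high score.
a = [1.0, 1.2, 0.8]
b = [3.0, 3.1, 2.9]
high = fisher_score(a, b)
low = fisher_score([1.0, 2.0, 3.0], [1.5, 2.5, 3.5])  # heavy overlap
```

Maximizing this ratio over candidate feature subsets is what drives the GA's fitness in this kind of wrapper.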
“…1. The algorithm starts by generating a random population of individuals, binary encoded with dimension equal to the initial number of features, each chromosome representing if a feature is used in the subset represented by the individual [19]. The individuals are evaluated using the ANN modeling method and depending on the performance of the individuals, they are selected for the creation of the next generation.…”
Section: Genetic Algorithm (mentioning)
confidence: 99%
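The binary encoding described in the quote can be sketched as below. This is a toy under stated assumptions: `evaluate()` stands in for training and scoring the ANN on the selected subset, and the "relevant" feature indices and size penalty are invented for illustration.

```python
import random

# Sketch of a binary-encoded GA for feature selection: each chromosome
# is a bit mask over the features (1 = feature used in the subset).

def evaluate(mask):
    relevant = {0, 2}  # assumed informative feature indices (toy)
    hits = sum(1 for i, bit in enumerate(mask) if bit and i in relevant)
    return hits - 0.1 * sum(mask)  # reward hits, penalize subset size

def ga_feature_selection(n_features=6, pop_size=12, generations=25, seed=1):
    rng = random.Random(seed)
    # Random initial population of bit masks.
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate, reverse=True)
        parents = pop[: pop_size // 2]            # keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # bit-flip mutation
                i = rng.randrange(n_features)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=evaluate)

best_mask = ga_feature_selection()
```

The chromosome length equals the number of candidate features, exactly as the quote describes; only the fitness evaluation (here a toy, there an ANN model) changes between applications.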