2015
DOI: 10.1007/978-3-319-23392-5_5

A Targeted Estimation of Distribution Algorithm Compared to Traditional Methods in Feature Selection

Abstract: The Targeted Estimation of Distribution Algorithm (TEDA) introduces a ‘Targeting’ process into an EDA/GA hybrid framework, whereby the number of active genes, or ‘control points’, in a solution is driven in an optimal direction. For larger feature selection problems with over a thousand features, traditional methods such as forward and backward selection are inefficient. Traditional EAs may perform better, but are slow to optimize if a problem is sufficiently noisy that most large solutions are equally ineffect…
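The targeting idea described in the abstract can be illustrated with a univariate EDA whose sampling probabilities are rescaled each generation so that the expected number of active features tracks the elite's mean solution size. This is a minimal sketch, not the published TEDA algorithm: the fitness signature, the 0.9/0.1 target update rate, and the probability clamping bounds are all assumptions made for illustration.

```python
import random

def targeted_eda(n_features, fitness, generations=50, pop_size=40, elite_frac=0.25):
    """Sketch of a targeting-style EDA for binary feature selection.

    Each individual is a 0/1 mask over features. A univariate probability
    vector is re-estimated from the elite each generation (UMDA-style), and
    the expected number of active features is nudged toward the elite's
    mean active-gene count -- the 'targeting' step.
    """
    probs = [0.5] * n_features
    target = n_features / 2          # current target count of active features
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        # sample a population of masks from the current marginals
        pop = []
        for _ in range(pop_size):
            mask = [1 if random.random() < p else 0 for p in probs]
            pop.append((fitness(mask), mask))
        pop.sort(key=lambda t: t[0], reverse=True)
        elite = pop[: max(1, int(elite_frac * pop_size))]
        if elite[0][0] > best_fit:
            best_fit, best = elite[0]
        # targeting: drift the target toward the elite's mean solution size
        elite_size = sum(sum(m) for _, m in elite) / len(elite)
        target = 0.9 * target + 0.1 * elite_size
        # UMDA update, rescaled so the expected active count matches target
        marginals = [sum(m[i] for _, m in elite) / len(elite)
                     for i in range(n_features)]
        total = sum(marginals) or 1e-9
        probs = [min(0.95, max(0.05, p * target / total)) for p in marginals]
    return best, best_fit
```

In a noisy problem where most large masks score similarly, rescaling toward a shrinking target concentrates sampling on compact solutions, which is the behaviour the abstract contrasts with forward/backward selection.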

Cited by 2 publications (2 citation statements) · References 16 publications
“…Pruning the structure of a multilayer perceptron [136]: cGA, EcGA, BOA; Design of a fully connected multilayer perceptron [137]: UMDA; Hyperparameters of a convolutional network […]; Symbolic regression [157]: UMDA; Symbolic regression [158]: denoising autoencoder genetic programming; Support vector regression [159]: UMDA_Gc; Feature subset selection — Selective naive Bayes [161]: EBNA; Selective naive Bayes [162]: EBNA; Selective naive Bayes [163]: cGA, EcGA, BOA; Support vector machines [164]: TEDA; Logistic regression [165]: UMDA_Gc…”
Section: EDAs in Supervised Learning
confidence: 99%
“…The naive Bayes classifier was also used in [162] for an empirical comparison among EBNA, two types of genetic algorithms and two greedy algorithms as sequential feature selection and sequential feature elimination; in [163], three EDAs, namely, cGA, EcGA, and BOA, were experimentally compared. In [164], a hybrid method consisting of a genetic algorithm and a UMDA, named TEDA, was applied to problems with tens of thousands of predictor variables. In [165], the EDA process (UMDA_Gc) was embedded in an adapted recursive feature elimination procedure of a logistic regression model.…”
Section: EDAs in Supervised Learning
confidence: 99%