AI 2007: Advances in Artificial Intelligence
DOI: 10.1007/978-3-540-76928-6_18
Feature Construction and Dimension Reduction Using Genetic Programming

Cited by 37 publications (23 citation statements)
References: 9 publications
“…Although augmented feature sets (datasets) are supersets of the constructed features, previous research has shown that, in the majority of cases, classification performance using only the constructed features is higher than with the augmented sets [63]. Moreover, because the augmented datasets contain more features, their dimensionality is slightly higher.…”
Section: A. Existing Issues | mentioning
Confidence: 97%
“…If the distribution of a class c along a feature X is normal, the interval I_c = (μ_c − 3σ_c, μ_c + 3σ_c) can cover 99% of the instances of the class, where μ_c = E(X|C=c) is the mean and σ_c = √(E(X²|C=c) − E²(X|C=c)) is the standard deviation of the class distribution. In [63], the conditional density of a (constructed) feature is assumed to be normal. This assumption is made without an explicit goodness-of-fit test, since running such a test in every fitness evaluation would be infeasible.…”
Section: Finding the Interval of a Class | mentioning
Confidence: 99%
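
As a hedged illustration of the interval described in this excerpt, the sketch below estimates μ_c and σ_c from sample data and forms I_c = (μ_c − 3σ_c, μ_c + 3σ_c) for each class. The function name, the toy data, and the use of NumPy are illustrative choices rather than details of the cited work; as in [63], the normality of each class-conditional distribution is simply assumed.

```python
# Illustrative sketch (not from the cited paper): per-class 3-sigma intervals
# under the assumption that each class-conditional distribution is normal.
import numpy as np

def class_intervals(X, y):
    """Return {class: (mu - 3*sigma, mu + 3*sigma)} for a single feature X.

    X : 1-D array of feature values (e.g. a GP-constructed feature)
    y : 1-D array of class labels, same length as X
    """
    intervals = {}
    for c in np.unique(y):
        values = X[y == c]
        mu = values.mean()                                # mu_c = E(X | C = c)
        sigma = np.sqrt((values ** 2).mean() - mu ** 2)   # sqrt(E(X^2|C=c) - E^2(X|C=c))
        intervals[c] = (mu - 3 * sigma, mu + 3 * sigma)
    return intervals

# Example usage with toy data: two well-separated classes.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(5.0, 0.5, 100)])
y = np.array([0] * 100 + [1] * 100)
print(class_intervals(X, y))
```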
“…Variable selection based on genetic programming has been exploited in various applications in which the significant inputs are generally unknown (for examples, see [15,8,7,11,9,3,2]).…”
Section: Variable Selection | mentioning
Confidence: 99%
“…One of the unique capabilities of genetic programming is its built-in ability to select significant variables and to gradually omit irrelevant ones while evolving models. Variable selection based on genetic programming has been exploited in various applications in which the significant inputs are generally unknown (for examples, see [36,15,21,23,26,36]).…”
Section: Variable Importance | mentioning
Confidence: 99%
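
As a rough, hypothetical illustration of this built-in selection effect, the sketch below scores variable importance by counting how often each input appears in a set of evolved GP expression trees. The tree encoding (nested tuples), the variable naming convention, and the frequency-based score are assumptions made for illustration; they do not reproduce the specific methods of the cited works.

```python
# Illustrative sketch (assumed encoding, not from the cited works):
# estimate variable importance as the fraction of variable references that
# each input accounts for across evolved GP trees such as
# ('add', ('mul', 'x1', 'x3'), 'x1').
from collections import Counter

def count_variables(tree, counts):
    """Recursively count leaf symbols that name input variables."""
    if isinstance(tree, tuple):          # internal node: (operator, child, ...)
        for child in tree[1:]:
            count_variables(child, counts)
    elif isinstance(tree, str) and tree.startswith('x'):
        counts[tree] += 1                # leaf referring to an input variable

def variable_importance(population):
    """Share of all variable references attributed to each variable."""
    counts = Counter()
    for tree in population:
        count_variables(tree, counts)
    total = sum(counts.values()) or 1
    return {var: n / total for var, n in counts.items()}

# Example: two evolved models that never use x2, so x2 receives no importance.
models = [
    ('add', ('mul', 'x1', 'x3'), 'x1'),
    ('sub', 'x3', ('div', 'x1', 'x4')),
]
print(variable_importance(models))
```

Variables that the evolutionary search drops from its models simply stop appearing in the trees, which is one simple way to make the "gradual omission" described above measurable.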