Pattern Recognition Recent Advances 2010
DOI: 10.5772/9356
Efficient Feature Subset Selection and Subset Size Optimization

Cited by 49 publications (38 citation statements); references 44 publications.
“…In the following, we will briefly present and evaluate this algorithm, along with another, general-purpose feature selection method called Sequential Forward Floating Selection (SFFS) (Somol et al, 2010). Then we will manually select a small set of filters, where the manual selection is guided by some simple heuristics.…”
Section: Gabor Filters
confidence: 99%
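The SFFS method cited above alternates greedy forward inclusion with conditional backward exclusion steps. The following is a minimal sketch of that idea, not the chapter's implementation; the scoring function and target subset size are hypothetical placeholders for whatever subset-evaluation criterion (e.g. classifier accuracy) a user supplies.

```python
def sffs(features, score, target_size):
    """Sequential Forward Floating Selection sketch.

    Greedily adds the best feature, then conditionally removes features
    whenever doing so beats the best subset previously seen at that size.
    """
    selected = []
    best_at_size = {}  # best score seen for each subset size
    while len(selected) < target_size:
        # Forward step: add the single feature that helps most.
        candidate = max((f for f in features if f not in selected),
                        key=lambda f: score(selected + [f]))
        selected.append(candidate)
        best_at_size[len(selected)] = score(selected)
        # Floating (backward) steps: drop the least useful feature if
        # the reduced subset improves on the best one of that size.
        while len(selected) > 2:
            worst = max(selected,
                        key=lambda f: score([g for g in selected if g != f]))
            reduced = [g for g in selected if g != worst]
            if score(reduced) > best_at_size.get(len(reduced), float("-inf")):
                selected = reduced
                best_at_size[len(selected)] = score(selected)
            else:
                break
    return selected


# Toy criterion (purely illustrative): reward subsets close to {0, 2, 4},
# with a small penalty per selected feature.
ideal = {0, 2, 4}
score = lambda subset: len(ideal & set(subset)) - 0.1 * len(subset)
print(sffs(list(range(6)), score, 3))  # [0, 2, 4]
```

The floating backward steps are what distinguish SFFS from plain sequential forward selection: they let the search undo earlier greedy choices when a smaller subset turns out to score better.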
“…In the following, we will briefly introduce these algorithms. As a detailed discussion of these methods is beyond the scope of our paper, we kindly refer the reader to Gramms (1991) and Somol et al (2010). We selected the size and overlap of our filters based on the literature and also on experimental findings.…”
Section: Automatic Feature Selection Algorithms
confidence: 99%
“…Nevertheless, a search does not need to be exhaustive in order to be optimal, as demonstrated by the branch-and-bound method and best-first search approaches. However, all optimal methods can be expected to be considerably slow for high-dimensional problems [3]. Thus, for many high-dimensional problems it is often preferable to employ heuristic methods that trade subset optimality for better computational efficiency.…”
Section: Introduction
confidence: 99%
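The branch-and-bound approach mentioned in this excerpt avoids exhaustive search by exploiting a monotone criterion: removing a feature can never increase the criterion value, so any intermediate subset that already scores no better than the best complete subset found can be pruned along with all its sub-subsets. A minimal sketch under that assumption; the additive criterion J and the feature "values" are hypothetical stand-ins.

```python
# Hypothetical monotone criterion: J(S) never increases when features are
# removed from S (an additive sum of per-feature values is monotone).
values = [5, 1, 4, 2, 3]
J = lambda subset: sum(values[i] for i in subset)


def branch_and_bound(n, d):
    """Find the best size-d subset of n features maximising a monotone J,
    pruning branches that cannot beat the current best."""
    best_val, best_set = float("-inf"), None

    def search(subset, next_removable):
        nonlocal best_val, best_set
        if J(subset) <= best_val:
            return  # monotonicity: no sub-subset can improve on best_val
        if len(subset) == d:
            best_val, best_set = J(subset), subset
            return
        for i in range(next_removable, n):
            if i in subset:
                search(subset - {i}, i + 1)

    search(frozenset(range(n)), 0)
    return best_set


print(sorted(branch_and_bound(5, 3)))  # [0, 2, 4]
```

The pruning test is the whole trick: whenever an intermediate (still too large) subset already fails to beat the best complete subset, every subset reachable from it is skipped without evaluation, yet optimality is preserved.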
“…Although the new features in the reduced-dimensional space are related to the original features, the actual interpretation of the original features, and hence the relation to the original system variables, is completely lost in most cases. This drawback should be taken into account when considering dimensionality reduction, since the original interpretation may be important for understanding the learning process that generates the new feature space [3]. Feature extraction is also often associated with computational inefficiency, even though it may significantly reduce the dimensionality, because the newly constructed features are based on a transformation that involves all original features, including irrelevant and redundant ones.…”
Section: Introduction
confidence: 99%
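The interpretability drawback described in this excerpt can be shown directly: an extraction method such as PCA builds each new feature as a weighted combination of all original features, so the original variable meanings are mixed together. A small numpy sketch on synthetic data (the data, seed, and correlation are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))    # 200 samples, 4 original features
X[:, 1] += 0.8 * X[:, 0]         # make features 0 and 1 correlated

Xc = X - X.mean(axis=0)          # centre the data
cov = np.cov(Xc, rowvar=False)   # 4x4 sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalues
pc1 = eigvecs[:, -1]             # loading vector of the first component

# Every loading is (generically) non-zero: the extracted feature depends
# on all original features, including any irrelevant or redundant ones.
print(pc1)
```

This is exactly why feature *selection* is attractive when interpretability matters: a selected subset keeps the original variables and their meanings intact, whereas an extracted feature like `pc1` has no single original variable behind it.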