Adaptive floating search methods in feature selection
Somol et al., 1999
DOI: 10.1016/s0167-8655(99)00083-5

Cited by 268 publications (133 citation statements). References 4 publications.

Citation statements:
“…Our proposed algorithm has some similarities with the adaptive floating search algorithm (Somol et al, 1999). However, our proposed algorithm does not require a predefined, user-specified absolute generalization limit, adaptive calculation of the absolute generalization limit, or termination criteria.…”
Section: Results
Mentioning; confidence: 99%

“…However, our proposed algorithm does not require a predefined, user-specified absolute generalization limit, adaptive calculation of the absolute generalization limit, or termination criteria. Further, implementation of our proposed algorithm is very straightforward compared to ASFFS (Somol et al, 1999).…”
Section: Results
Mentioning; confidence: 99%
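
For context on what these excerpts contrast against: floating search interleaves greedy forward inclusion of features with conditional backward exclusion, and ASFFS (Somol et al, 1999) further adapts how many features may be added or removed per step near the target subset size. The following is a minimal, illustrative Python sketch of plain sequential forward floating selection (SFFS) only, not the authors' implementation; the names `sffs`, `score`, and `target_size` are placeholders, with `score` standing in for any subset-quality criterion.

    def sffs(features, score, target_size):
        """Sketch of sequential forward floating selection (SFFS).

        `score(subset)` is a hypothetical criterion returning a value
        to maximize, e.g. cross-validated classifier accuracy.
        """
        selected = []
        best = {}  # best criterion value seen at each subset size
        while len(selected) < target_size:
            # Forward step: add the most promising remaining feature.
            add = max((f for f in features if f not in selected),
                      key=lambda f: score(selected + [f]))
            selected = selected + [add]
            k = len(selected)
            best[k] = max(best.get(k, float("-inf")), score(selected))
            # Floating (conditional backward) steps: drop a feature if
            # the reduced subset beats the best subset ever seen at
            # that smaller size; otherwise resume forward search.
            while len(selected) > 2:
                reduced = max(([g for g in selected if g != f] for f in selected),
                              key=score)
                if score(reduced) > best.get(len(reduced), float("-inf")):
                    selected = reduced
                    best[len(reduced)] = score(reduced)
                else:
                    break
        return selected

The `best` dictionary is what makes the search "float": the algorithm may backtrack to a smaller subset whenever doing so strictly improves on anything previously seen at that size.
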
“…Thus, for many high-dimensional problems it is often preferable to employ heuristic methods that trade subset optimality for better computational efficiency. A few examples of such search strategies are sequential search [5,6], floating search [7][8][9], random mutation hill climbing [10] and evolutionary-based approaches [11][12][13][14].…”
Section: Introduction
Mentioning; confidence: 99%
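
Of the strategies just listed, random mutation hill climbing is the simplest to state in code. The sketch below is an illustrative reading of the general idea for feature selection, not code from reference [10]: the candidate subset is a boolean inclusion mask, one randomly chosen bit is flipped per iteration, and the flip is kept only when a hypothetical `score(mask)` criterion does not decrease.

    import random

    def rmhc(n_features, score, iters=1000, seed=0):
        """Random mutation hill climbing over feature bitmasks.

        `score(mask)` is a hypothetical criterion over a boolean
        inclusion mask, higher being better.
        """
        rng = random.Random(seed)
        mask = [rng.random() < 0.5 for _ in range(n_features)]
        best = score(mask)
        for _ in range(iters):
            i = rng.randrange(n_features)
            mask[i] = not mask[i]       # toggle one feature in or out
            current = score(mask)
            if current >= best:
                best = current          # keep improvements (and ties)
            else:
                mask[i] = not mask[i]   # revert the unhelpful flip
        return mask, best
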
“…Under the first strategy, one starts with the full set of features and sequentially eliminates the weakest feature, while bottom-up methods first choose the single best feature and iteratively add the best of the remaining features. More advanced methods consider sets rather than single features at a time and combine forward and backward search with each other [27][32].…”
Mentioning; confidence: 99%
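
The two baseline strategies this excerpt describes reduce to a few lines each. As before, this is a hedged sketch with a placeholder criterion `score(subset)`: top-down selection shrinks the full set by repeatedly dropping the weakest feature, while bottom-up selection grows an initially empty set by repeatedly adding the strongest one.

    def sfs(features, score, k):
        """Bottom-up (sequential forward selection): grow from the empty set."""
        selected = []
        while len(selected) < k:
            # Add the feature whose inclusion maximizes the criterion.
            selected.append(max((f for f in features if f not in selected),
                                key=lambda f: score(selected + [f])))
        return selected

    def sbs(features, score, k):
        """Top-down (sequential backward selection): shrink from the full set."""
        selected = list(features)
        while len(selected) > k:
            # Drop the feature whose removal hurts the criterion least.
            weakest = max(selected,
                          key=lambda f: score([g for g in selected if g != f]))
            selected.remove(weakest)
        return selected

Floating search (sketched earlier) alternates between these two moves instead of committing to a single direction.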