1994
DOI: 10.1016/0167-8655(94)90127-9
Floating search methods in feature selection

Cited by 2,687 publications (1,508 citation statements)
References 9 publications
“…not using X, Y coordinates and first- and second-order derivatives of them). These correspond to the time functions reported in Table I. Due to the low amount of available training data in a real signature case, the Sequential Forward Feature Selection (SFFS) algorithm [20] is performed in order to obtain a subset of time functions for each system considered in this work, improving the performance in terms of EER (%). This technique offers a suboptimal solution since it does not take into account all possible feature combinations, although it does consider correlations between features.…”
Section: B Feature Extraction and Selectionmentioning
confidence: 99%
“…A number of suboptimal selection methods are also discussed in [1]. Of these, [2] found that the so-called Sequential Forward Floating Search [3] method produced the best results, performing close to optimal while demanding lower computational resources than other methods. This method is a bottom-up search procedure, where the term floating indicates that the number of features changes dynamically, with one feature included and/or excluded at each iteration.…”
Section: Introductionmentioning
confidence: 99%
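The floating search described in the statement above — greedy forward inclusion followed by conditional backward exclusion — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the criterion function `score` (higher is better) and the toy feature set in the usage example are assumptions for demonstration.

```python
def sffs(features, score, k):
    """Sequential Forward Floating Search (sketch).

    features: list of candidate features.
    score:    callable evaluating a feature subset (higher is better).
    k:        target subset size.
    """
    selected = []
    best_at_size = {}  # best score observed for each subset size

    while len(selected) < k:
        # Inclusion step: add the single feature that improves the score most.
        best_f = max((f for f in features if f not in selected),
                     key=lambda f: score(selected + [f]))
        selected.append(best_f)
        best_at_size[len(selected)] = score(selected)

        # Conditional exclusion ("floating") step: keep removing the least
        # useful feature while doing so beats the best subset previously
        # seen at that smaller size.
        while len(selected) > 2:
            worst = max(selected,
                        key=lambda f: score([g for g in selected if g != f]))
            reduced = [g for g in selected if g != worst]
            if score(reduced) > best_at_size.get(len(reduced), float("-inf")):
                selected = reduced
                best_at_size[len(selected)] = score(selected)
            else:
                break
    return selected


# Hypothetical usage: reward features in a target set, penalize the rest.
target = {1, 3, 4}

def toy_score(subset):
    s = set(subset)
    return len(s & target) - 0.5 * len(s - target)

chosen = sffs(list(range(6)), toy_score, k=3)
```

The backward step is what distinguishes floating search from plain sequential forward selection: a feature added early can later be dropped if a better subset of the same size emerges, which is why the method handles non-monotonic criteria better than purely greedy searches.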
“…These algorithms have quadratic complexity, but they perform poorly for nonmonotonic indices. In such cases, sequential floating searches [3] provide better results, though at the cost of a higher computational complexity. Beam search variants of the sequential algorithms [4] are also used to reduce computational complexity.…”
Section: Introductionmentioning
confidence: 99%