2009
DOI: 10.1016/j.patcog.2008.11.018

An improvement on floating search algorithms for feature subset selection

Cited by 119 publications (65 citation statements); citing publications span 2010–2022.
References 14 publications.
“…* Forward methods: starting with an empty set of features, they add one feature at each step until obtaining the optimal subset of features [195] [185]. * Backward methods: starting from a set containing all the features, they discard one feature at each step until obtaining the optimal set of features [165] [207] [210]. * Random methods: they try several subsets of features in a random way, aiming to avoid being trapped in local optima (i.e.…”
Section: Feature Selection (mentioning)
Confidence: 99%
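To make the forward strategy in the statement above concrete, here is a minimal sketch of sequential forward selection, assuming a user-supplied criterion function J that scores a feature subset (e.g., a separability measure or classifier accuracy). The function name, the parameter k, and J are illustrative assumptions, not taken from the cited papers.

```python
def sequential_forward_selection(n_features, k, J):
    """Greedily grow a subset from empty until it holds k features,
    scoring candidates with the (assumed) criterion J(subset)."""
    selected = set()
    while len(selected) < min(k, n_features):
        best_f, best_score = None, float("-inf")
        for f in range(n_features):
            if f in selected:
                continue
            # Score the subset that would result from adding feature f.
            score = J(selected | {f})
            if score > best_score:
                best_f, best_score = f, score
        selected.add(best_f)  # add the single most beneficial feature
    return selected
```

Plain SFS is greedy and suffers from nesting (a feature, once added, is never removed), which is the weakness the floating-search variants discussed in the next statement address.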
“…The benefit of ASFFS is that it provides a less redundant subset than the SFFS algorithm. Nakariyakul and Casasent [14] proposed an improved forward floating search algorithm, which adds a new search step that checks whether a weak feature in the current subset should be replaced by a better one, repeating the replacement until it can no longer improve the criterion function. They found that this method obtained optimal solutions for many feature subsets and was less computationally intensive than optimal feature selection algorithms such as exhaustive search.…”
Section: Floating Search Methods (mentioning)
Confidence: 99%
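The replacement step this statement describes can be sketched as below. This is an illustrative reconstruction of the idea as summarized above, not the authors' reference implementation; selected, n_features, and J are the same hypothetical bookkeeping and criterion as in the previous sketch.

```python
def replace_weak_feature(selected, n_features, J):
    """Try swapping the weakest selected feature for an unused one,
    keeping a swap only if it improves the criterion; stop once no
    replacement helps (as the summarized step requires)."""
    improved = bool(selected)  # nothing to replace in an empty subset
    while improved:
        improved = False
        # Weakest feature: the one whose removal lowers J the least.
        weak = max(selected, key=lambda g: J(selected - {g}))
        for f in range(n_features):
            if f in selected:
                continue
            candidate = (selected - {weak}) | {f}
            if J(candidate) > J(selected):
                selected, improved = candidate, True
                break  # re-identify the weakest feature after the swap
    return selected
```

In the paper's algorithm this check is interleaved with the usual floating (conditional exclusion) step of SFFS; here it is shown in isolation for clarity.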
“…Hence, wrapper approaches are frequently used [20], which select the features based on the performance of a classification algorithm. However, E. Tuv et al. report that, due to the procedural complexity involved in both filters and wrappers, feature ensembles can be an alternative strategy for feature subsetting [21]. As advancements to the aforesaid work, a few research works have been carried out using heuristic approaches such as Sequential Floating Forward Selection, Sequential Floating Backward Elimination, and Genetic Algorithms [23,24,25,26], as well as a metaheuristic approach [27], for selecting the features. It is also observed that when no additional information is provided for the feature selection, Rough Set Theory can play a significant role in determining dispensable features [38,39].…”
Section: Fig. 1 Various Existing Feature Selection Strategies (mentioning)
Confidence: 99%
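A wrapper criterion of the kind this statement describes can be written as a thin adapter around a classifier's cross-validated score. The sketch below assumes scikit-learn, with a k-nearest-neighbours classifier and cv=5 chosen purely for illustration; any estimator and scoring scheme could stand in.

```python
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def wrapper_criterion(X, y, subset):
    """Score a feature subset by the mean cross-validated accuracy of a
    classifier trained on just those columns of X (the classifier and
    fold count are illustrative choices, not from the cited papers)."""
    cols = sorted(subset)
    return cross_val_score(KNeighborsClassifier(), X[:, cols], y, cv=5).mean()

# Such a criterion plugs directly into the earlier sketches, e.g.:
#   J = lambda s: wrapper_criterion(X, y, s)
#   best = sequential_forward_selection(X.shape[1], k=10, J=J)
```

This also makes the cost of wrappers explicit: every candidate subset the search evaluates triggers a full cross-validation run, which is why filter and ensemble alternatives become attractive on large feature sets.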