2009
DOI: 10.1007/978-3-642-05258-3_54
Hybridization of Evolutionary Mechanisms for Feature Subset Selection in Unsupervised Learning

Cited by 6 publications (12 citation statements) · References 18 publications
“…Please note that we do not make comparisons with the GA published in [15], because the authors did not provide the algorithm needed to compare it against the proposed Hill Climbing algorithm. Besides, the only dataset presented in [15] has 29 features, which is regarded as a relatively small Basic Matrix.…”
Section: Remark (mentioning, confidence: 98%)
“…Besides, the only dataset presented in [15] has 29 features, which is regarded as a relatively small Basic Matrix. The number of rows of the Basic Matrix above is large.…”
Section: Remark (mentioning, confidence: 99%)
“…Regularly, Feature Subset Selection (FSS) [13] is used to reduce dimensionality [14]: it efficiently reduces the number of variables, attributes, or features used to describe the objects and identifies their influence on a problem. This is an alternative method that starts from the set of typical testors and removes irrelevant or redundant features [11,14].…”
Section: Feature Subset Selection (mentioning, confidence: 99%)
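The quoted passage describes filter-style feature subset selection: discarding redundant features so a smaller subset describes the objects. As a minimal illustration of that idea (not the cited paper's evolutionary or typical-testor method), the sketch below greedily keeps a feature only if it is not highly correlated with an already-kept one; the threshold, function name, and data are assumptions for the example.

```python
import numpy as np

def select_features(X, corr_threshold=0.95):
    """Greedy redundancy filter (illustrative sketch, not the paper's method).

    Keeps feature j only if its absolute Pearson correlation with every
    already-kept feature is below corr_threshold.
    Returns the list of kept column indices.
    """
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < corr_threshold for k in kept):
            kept.append(j)
    return kept

# Toy data: feature 1 is an exact multiple of feature 0 (redundant),
# feature 2 is independent noise.
rng = np.random.default_rng(0)
a = rng.normal(size=100)
X = np.column_stack([a, a * 2.0, rng.normal(size=100)])
print(select_features(X))  # → [0, 2]
```

The redundant column 1 is dropped because its correlation with column 0 is exactly 1, matching the passage's point that removing redundant features shrinks the description of the objects without losing information.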
“…FSS is important because reducing the number of features may help decrease the cost of acquiring data and also make classification models easier to understand [14,15]. The number of features can also affect classification accuracy.…”
Section: Feature Subset Selection (mentioning, confidence: 99%)