2015
DOI: 10.1016/j.engappai.2015.05.005
A graph theoretic approach for unsupervised feature selection

Cited by 92 publications (37 citation statements)
References 53 publications
“…In a manner similar to [17], the termination of the iterative search for representative features depends on the threshold δ, which dictates the degree of importance of each feature. Nonetheless, instead of employing a predetermined, user-defined value for δ, we consider that its value varies with respect to the median value M_κ of LC_m × dH_m across the entire cluster c_κ.…”
Section: A Graph-based Feature Selection Using Node Centrality and R… (mentioning)
confidence: 99%
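The adaptive threshold described in the excerpt can be sketched as follows. This is an illustrative assumption, not the authors' implementation: `adaptive_delta`, the toy scores, and the keep/eliminate rule are all hypothetical stand-ins for the per-cluster median of LC_m × dH_m.

```python
import statistics

def adaptive_delta(scores):
    """Adaptive termination threshold: the median of the per-feature
    importance scores (LC_m * dH_m products) within one cluster,
    rather than a fixed user-defined delta. `scores` is a hypothetical
    list of such products for the features of a single cluster."""
    return statistics.median(scores)

# Toy importance scores for the features of one cluster c_k (made up).
cluster_scores = [0.12, 0.45, 0.30, 0.88, 0.51]
delta = adaptive_delta(cluster_scores)

# Features scoring at or above the cluster's median survive;
# the rest are candidates for elimination as redundant.
selected = [i for i, s in enumerate(cluster_scores) if s >= delta]
```

Because the median is recomputed per cluster, δ automatically adapts to each cluster's score distribution instead of being tuned globally by the user.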
“…As a representative example, the Graph Clustering with Node Centrality (NC) feature selection method [17] combines the following characteristics: (a) it employs the Pearson product-moment correlation coefficient as the similarity measure between features; (b) it uses a community detection algorithm for grouping the features into clusters; (c) it exploits the Laplacian centrality (LC) [18] for characterizing the connectivity and density around a feature, while taking into account both local as well as global characterization of its importance (centrality) within the cluster; (d) it proposes an iterative search strategy that uses NC for eliminating redundant features from each cluster.…”
Section: A Graph-based Feature Selection Using Node Centrality and R… (mentioning)
confidence: 99%
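The GCNC pipeline in (a)–(d) above can be sketched roughly as follows. This is a simplified illustration, not the method's actual implementation: plain connected components stand in for the community-detection step, the 0.8 correlation cutoff is an arbitrary assumption, and the Laplacian-centrality elimination step (d) is omitted.

```python
from itertools import combinations

def pearson(x, y):
    """Pearson product-moment correlation, step (a), in plain Python."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def similarity_graph(features, theta=0.8):
    """Connect feature pairs whose |correlation| exceeds theta
    (theta is an assumed cutoff; GCNC's construction may differ)."""
    edges = set()
    for i, j in combinations(range(len(features)), 2):
        if abs(pearson(features[i], features[j])) > theta:
            edges.add((i, j))
    return edges

def connected_components(n, edges):
    """Crude stand-in for step (b): group features by connected
    component of the similarity graph via union-find."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in edges:
        parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Toy data: four features (rows) observed over five samples.
X = [
    [1, 2, 3, 4, 5],    # f0
    [2, 4, 6, 8, 10],   # f1: perfectly correlated with f0
    [5, 3, 4, 1, 2],    # f2: weakly related to the rest
    [1, 1, 2, 2, 3],    # f3: strongly correlated with f0
]
edges = similarity_graph(X)
clusters = connected_components(len(X), edges)
```

In the full method, step (d) would then rank the features inside each cluster by node centrality and iteratively discard the redundant ones.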
“…In the case of S-shaped and V-shaped conversion functions, the number of times each feature dimension belongs to the optimal feature subset is counted, as shown in Figure 7. It can be seen from Figure 7 that the 3rd, 13th, 22nd, 23rd, 24th, and 31st dimensions appear most frequently, but the final optimal feature subset {1, 3, 10, 11, 12, 13, 14, 18, 19, 21, 23, 24, 29, 30, 31} does not incorporate all of these higher-frequency features (that is, the optimal feature subset is not a simple combination of the most frequent features). The optimal feature subset need not include every high-frequency feature, since a subset assembled only from high-frequency features may not yield the best classification performance.…”
Section: V1 (mentioning)
confidence: 99%
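The frequency count the excerpt refers to can be sketched minimally as below; the subsets are hypothetical stand-ins for the paper's actual per-run results, and the indices are illustrative only.

```python
from collections import Counter

# Hypothetical optimal subsets from repeated runs with different
# transfer (conversion) functions; indices are made up for illustration.
runs = [
    {3, 13, 22, 23, 24, 31},
    {3, 13, 22, 24, 31},
    {1, 3, 13, 23, 24, 31},
]

# Count how often each feature entered a per-run optimal subset.
freq = Counter(i for subset in runs for i in subset)

# Features ranked by selection frequency ...
by_frequency = [i for i, _ in freq.most_common()]
# ... which need not coincide with the single best subset, because
# classification accuracy is judged on the subset as a whole, not
# on each feature independently.
```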
“…Therefore, the methods using this approach are typically fast but they need a threshold as the stopping criterion for feature selection. Several filter-based methods have been proposed in the literature including information gain [10], gain ratio [11], term variance [12], Gini index [13], Laplacian score [14], Fisher score [15], minimal-redundancy-maximal-relevance [16], the random subspace method [17], relevance-redundancy feature selection [18], unsupervised feature selection based on ant colony optimization (UFSACO) [19], relevance-redundancy feature selection based on ant colony optimization (RRFSACO) [20], graph clustering with node centrality for feature selection (GCNC) [21], and graph clustering based ant colony optimization feature selection (GCACO) [22]. Wrapper-based methods combine feature selection with the design of the classifier and evaluate the feature subsets on the basis of the accuracy of classification.…”
Section: Related Work (mentioning)
confidence: 99%
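One of the simplest filters listed above, term variance, illustrates the pattern the excerpt describes: score each feature independently of any classifier and stop at a threshold. The function name, cutoff, and toy data are assumptions for illustration.

```python
import statistics

def variance_filter(features, threshold):
    """Minimal filter-style selector (term variance): keep features
    whose variance exceeds a user-supplied threshold, ranked from
    most to least variable. The threshold is the stopping criterion
    the excerpt mentions; no classifier is consulted, which is what
    makes filter methods fast."""
    scores = {i: statistics.pvariance(col) for i, col in enumerate(features)}
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [i for i, s in ranked if s > threshold]

# Toy data: three features (rows) observed over four samples.
X = [
    [1, 1, 1, 1],  # f0: zero variance, carries no information
    [0, 5, 0, 5],  # f1: high variance
    [2, 3, 2, 3],  # f2: low variance
]
selected = variance_filter(X, threshold=1.0)
```

A wrapper method would instead retrain a classifier on each candidate subset, which is more accurate but far more expensive.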