2014 IEEE 26th International Conference on Tools with Artificial Intelligence
DOI: 10.1109/ictai.2014.47

A Supervised Feature Selection Algorithm through Minimum Spanning Tree Clustering

Cited by 11 publications (5 citation statements)
References 11 publications

“…Different from other MST-based feature selection methods (Liu et al., 2014; Song et al., 2013a), we use only the degree of each node in the MST to obtain the ranking order of each feature, without involving any clustering procedure in the feature selection. Compared with the well-known mRMR feature selection method (Peng et al., 2005), our method not only adaptively determines the number of selected features, but also selects features that are more effective in discriminating among ROIs from different classes of CISLs.…”
Section: Discussion
confidence: 99%
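The degree-ranking idea described in this statement can be illustrated with a small, hypothetical sketch (not the citing authors' implementation): build an MST over a feature graph and rank features by their node degree. The dissimilarity measure (1 − |Pearson correlation|) and all names below are assumptions for illustration only.

```python
# Hypothetical sketch: rank features by node degree in an MST built over
# pairwise feature dissimilarities (an assumed 1 - |correlation| weight,
# not the measure used in the cited works).
import numpy as np
import networkx as nx

def rank_features_by_mst_degree(X):
    """X: (n_samples, n_features). Returns feature indices ranked by MST degree."""
    n_features = X.shape[1]
    dissim = 1.0 - np.abs(np.corrcoef(X, rowvar=False))  # feature dissimilarity

    G = nx.Graph()
    for i in range(n_features):
        for j in range(i + 1, n_features):
            G.add_edge(i, j, weight=dissim[i, j])

    mst = nx.minimum_spanning_tree(G)   # spanning tree over all features
    degrees = dict(mst.degree())        # degree of each feature node
    return sorted(degrees, key=degrees.get, reverse=True)
```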
“…By computing the MST of the weighted graph, we can obtain the importance of each feature and select the most important features to characterize the ROIs. Different from the other MST-based feature selection methods (Song et al., 2013a; Liu et al., 2014), our method selects the optimal features over the global scope rather than within a local range. Our method selects features from the full feature set, while Song et al (2013a) and Liu et al (2014) cluster the whole feature set into several groups and then choose features from each group.…”
Section: Introduction
confidence: 99%
“…The first phase uses a minimum spanning tree to cluster all the features, while the second phase selects the most representative features from each cluster to form a subset of features. Liu et al. propose a minimum spanning tree-based clustering method in which variation-of-information measures are used to assess the correlation and redundancy between features [45]. First, a minimum spanning tree is created based on the correlation between features, and then the long edges are removed to form clusters.…”
Section: Related Work
confidence: 99%
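A minimal sketch of the clustering phase just described, under stated assumptions: edge weights here are 1 − |correlation| (a stand-in for the paper's variation-of-information measure), and "long" edges are cut at an assumed quantile threshold rather than the paper's pruning rule.

```python
# Hypothetical sketch of MST-based feature clustering: compute an MST over
# feature dissimilarities, then cut long edges so the remaining connected
# components form the feature clusters.
import numpy as np
import networkx as nx

def mst_feature_clusters(X, cut_quantile=0.9):
    n_features = X.shape[1]
    dissim = 1.0 - np.abs(np.corrcoef(X, rowvar=False))  # assumed edge weight

    G = nx.Graph()
    for i in range(n_features):
        for j in range(i + 1, n_features):
            G.add_edge(i, j, weight=dissim[i, j])

    mst = nx.minimum_spanning_tree(G)
    weights = [d["weight"] for _, _, d in mst.edges(data=True)]
    threshold = np.quantile(weights, cut_quantile)       # assumed cut rule

    long_edges = [(u, v) for u, v, d in mst.edges(data=True)
                  if d["weight"] > threshold]
    mst.remove_edges_from(long_edges)                    # split the tree
    return [sorted(c) for c in nx.connected_components(mst)]
```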
“…Finally, the feature with the highest correlation with the class label is selected from each cluster. Another study (Liu et al., 2014) under a supervised framework similarly used an MST for grouping and variation of information as the relevance measure. The desired number of features and the pruning rate must be given as inputs to their algorithm.…”
Section: Feature Selection Approaches
confidence: 99%
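To complete the picture, a short hypothetical sketch of the second phase described above: from each cluster, keep the single feature most relevant to the class label. Mutual information is used here only as a stand-in relevance measure (the cited work's criterion is based on variation of information), and `mst_feature_clusters` refers to the assumed sketch given earlier.

```python
# Hypothetical sketch: pick one representative feature per cluster, namely the
# one most relevant to the class label (mutual information as a stand-in measure).
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def select_cluster_representatives(X, y, clusters):
    relevance = mutual_info_classif(X, y)   # relevance of each feature to y
    return [max(cluster, key=lambda f: relevance[f]) for cluster in clusters]

# Example usage (with the assumed clustering sketch above):
# clusters = mst_feature_clusters(X, cut_quantile=0.9)
# selected = select_cluster_representatives(X, y, clusters)
```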