Clustering-Guided Sparse Structural Learning for Unsupervised Feature Selection
2014 | DOI: 10.1109/tkde.2013.65

Cited by 244 publications (19 citation statements) | References 27 publications
“…For example, Li et al.'s robust structured subspace learning (RSSL) [23] uses the ℓ2,1-norm for sparse feature extraction, combining high-level semantics with low-level, locality-preserving features. In the feature selection algorithm clustering-guided sparse structural learning (CGSSL) by Li et al. [24], features are jointly selected using sparse regularization (via the ℓ2,1-norm) and a non-negative spectral clustering objective. The selected features are not only sparse but also the most discriminative for predicting the cluster indicators, both in the original space and in a lower-dimensional subspace on which the data is assumed to lie.…”
Section: Related Work
confidence: 99%
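The statement above describes a joint objective: regress cluster indicators from the data with an ℓ2,1-norm penalty on the transformation matrix, so that entire feature rows are driven toward zero and the surviving rows mark the selected features. Below is a minimal sketch of that regression step, assuming a precomputed cluster indicator matrix F (e.g. from spectral clustering); the function name, the iteratively reweighted least-squares solver, and all parameter values are illustrative, not the authors' implementation.

```python
import numpy as np

def l21_feature_ranking(X, F, gamma=0.1, n_iter=30):
    """Rank features by solving min_W ||X W - F||_F^2 + gamma * ||W||_{2,1}
    with a standard iteratively reweighted least-squares scheme.

    X : (n_samples, n_features) data matrix
    F : (n_samples, n_clusters) cluster indicator matrix
    Returns feature indices sorted by decreasing row norm of W.
    """
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, refreshed each iteration
    for _ in range(n_iter):
        # Closed-form update: W = (X^T X + gamma * D)^{-1} X^T F
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ F)
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * row_norms + 1e-12))
    # Large row norms = features most predictive of the cluster indicators
    return np.argsort(-row_norms)
```

Because the penalty couples all columns of each row of W, it zeroes out features jointly across all cluster indicators, which is what makes the selection "structural" rather than per-indicator.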
“…Sparse filtering and EPLS do not attempt to explicitly model the input distribution but instead focus on the properties of the output distribution. To remove redundancy and noise in high-dimensional features, feature selection techniques [36], [37] and non-negative matrix factorization (NMF) [38] are often adopted to identify a subset of the most useful features.…”
Section: Related Work
confidence: 99%
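As a concrete illustration of the NMF route mentioned above, here is a minimal sketch using scikit-learn's NMF on toy non-negative data; it is not tied to the specific method of reference [38], and the component count is an arbitrary choice.

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy non-negative data: 100 samples, 50 features
X = np.abs(np.random.RandomState(0).randn(100, 50))

# Factor X ~ W H with a small inner dimension to strip redundant/noisy directions
model = NMF(n_components=10, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(X)   # (100, 10) compressed sample representation
H = model.components_        # (10, 50) non-negative basis over original features
```

Rows of H with large weight on a feature indicate that the feature contributes to a recovered part; features with uniformly small weight across all rows are candidates for removal.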
“…Zechao Li et al. [17] proposed a new unsupervised feature selection algorithm that integrates cluster analysis with sparse structural analysis. In particular, nonnegative spectral clustering is used to learn more accurate cluster labels for the input samples, and the learned labels in turn guide the feature selection.…”
Section: Related Work
confidence: 99%
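For the nonnegative spectral clustering step this statement describes, a minimal sketch follows, using the standard multiplicative update for maximizing tr(F^T A F) subject to F ≥ 0 on a symmetrically normalized affinity matrix; the affinity construction, iteration count, and function name are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def nonnegative_spectral_labels(S, k, n_iter=200, eps=1e-12):
    """Learn a nonnegative cluster indicator F from a symmetric, nonnegative
    affinity matrix S via multiplicative updates, then return hard labels.

    S : (n_samples, n_samples) affinity matrix
    k : number of clusters
    """
    d = S.sum(axis=1)
    A = S / np.sqrt(np.outer(d, d))          # D^{-1/2} S D^{-1/2}
    rng = np.random.RandomState(0)
    F = np.abs(rng.randn(S.shape[0], k))     # nonnegative initialization
    for _ in range(n_iter):
        # Multiplicative update keeps F nonnegative at every step
        F *= np.sqrt((A @ F) / (F @ (F.T @ A @ F) + eps))
    return F.argmax(axis=1)                  # hard labels from the soft indicator
```

Because F stays nonnegative throughout, each row can be read directly as a soft cluster assignment, which is what makes the learned labels usable as regression targets for the sparse feature selection step.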