2017
DOI: 10.1109/tnnls.2016.2521602

Robust Joint Graph Sparse Coding for Unsupervised Spectral Feature Selection

Abstract: In this paper, we propose a new unsupervised spectral feature selection model by embedding a graph regularizer into the framework of joint sparse regression for preserving the local structures of data. To do this, we first extract the bases of the training data by previous dictionary learning methods and then map the original data into the basis space to generate their new representations, by proposing a novel joint graph sparse coding (JGSC) model. In JGSC, we first formulate its objective function by simultaneously…
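The abstract is truncated, so the paper's exact objective is not visible here. As a rough orientation only, a graph-regularized joint sparse regression of the kind the abstract describes typically minimizes something like the following, where X is the data in the learned basis space, Y the new representation, L a graph Laplacian, and α, β trade-off weights; all of this notation is our illustrative assumption, not the paper's:

```latex
% Illustrative sketch only; the paper's actual JGSC formulation may differ.
\[
\min_{W}\;
\underbrace{\lVert X^{\top} W - Y \rVert_{2,1}}_{\text{robust joint sparse regression}}
\;+\; \alpha\,
\underbrace{\operatorname{tr}\!\left( W^{\top} X L X^{\top} W \right)}_{\text{graph regularizer (local structure)}}
\;+\; \beta\,
\underbrace{\lVert W \rVert_{2,1}}_{\text{row sparsity for selection}}
\]
```

In such formulations the ℓ2,1 norm on W zeroes out whole rows, so features can be ranked by row norm and the top ones selected, while the trace term penalizes solutions that break the neighborhood graph encoded in L.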

Cited by 306 publications (82 citation statements). References: 46 publications.
“…In Tables 5 and 6, the number in parentheses is the number of selected features that corresponds to the best clustering result. Since we use the same clustering-experiment parameter settings as our previous work [21], the clustering results of some compared approaches are the same as those in [21]. Several interesting points can be observed from Tables 5 and 6.…”
Section: Clustering Results and Analysis (mentioning)
confidence: 96%
“…The ten comparison approaches include LS [11], MCFS [8], SPEC [12], UDSFS [13], RUFS [16], RSR [18], SPNFSR [20], JGSC [21], NSSRD [23], and MFFS [27]. Meanwhile, the parameters are tuned over the grid {…, 10^4, 10^5} on all the databases.…”
Section: Experimental Settings (mentioning)
confidence: 99%
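Only the upper endpoints of the quoted tuning range survived extraction. A minimal sketch below reconstructs the grid under the hypothetical, but common, assumption that it is log-spaced, symmetric around 1, and runs from 10^-5 to 10^5:

```python
# Hypothetical reconstruction of the tuning grid from the quote above.
# Only "..., 10^4, 10^5" is recoverable; the lower endpoint 10^-5 is an
# assumption based on the symmetric grids common in this literature.
param_grid = [10.0 ** k for k in range(-5, 6)]
print(param_grid)  # [1e-05, 0.0001, ..., 10000.0, 100000.0]
```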
“…Existing feature selection algorithms can be categorized as supervised feature selection (on data with full class labels) [5]-[9], unsupervised feature selection (on data without class labels) [10]-[15], and semi-supervised feature selection (on data with partial labels) [14], [16], [17]. Feature selection in the unsupervised context is considered more difficult than the other two cases, since no target information is available for training.…”
Section: Introduction (mentioning)
confidence: 99%
“…In [17], a global and local structure preservation framework that integrates global pairwise sample similarity with the local geometric structure of the data is proposed for feature selection. In [15] and [28]-[31], spectral learning that aims to preserve the underlying manifold structure is applied to select proper features. In [32], embedding learning and sparse regression are jointly applied to perform feature selection.…”
Section: Introduction (mentioning)
confidence: 99%
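To make the spectral idea in this passage concrete, here is a minimal sketch in the spirit of the Laplacian Score (LS [11], one of the methods named above): each feature is scored by how smoothly it varies over a k-NN graph built on the data, and the lowest-scoring features are kept. The k-NN/heat-kernel graph construction and all names below are illustrative assumptions, not the implementation of any cited paper.

```python
import numpy as np

def laplacian_score(X, k=5, sigma=1.0):
    """Score the d features of an n x d matrix X; lower scores mark
    features that vary smoothly over the k-NN graph (manifold structure)."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances between samples.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Heat-kernel affinities, restricted below to k-NN neighborhoods.
    W = np.exp(-sq / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    idx = np.argsort(sq, axis=1)[:, 1:k + 1]        # k nearest neighbors
    mask = np.zeros_like(W, dtype=bool)
    mask[np.repeat(np.arange(n), k), idx.ravel()] = True
    W = np.where(mask | mask.T, W, 0.0)             # symmetrize the graph
    D = W.sum(axis=1)                                # degree vector
    L = np.diag(D) - W                               # unnormalized Laplacian
    scores = np.empty(X.shape[1])
    for r in range(X.shape[1]):
        f = X[:, r]
        f = f - (f @ D) / D.sum()                    # remove D-weighted mean
        denom = f @ (D * f)
        scores[r] = (f @ L @ f) / denom if denom > 1e-12 else np.inf
    return scores

# Usage: keep the m features with the smallest scores.
X = np.random.rand(100, 20)
selected = np.argsort(laplacian_score(X))[:5]
```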