2016
DOI: 10.1109/tgrs.2016.2517242
Kernel Low-Rank and Sparse Graph for Unsupervised and Semi-Supervised Classification of Hyperspectral Images

Cited by 66 publications (31 citation statements)
References 44 publications
“…The optimal solutions are acquired by solving generalized eigenvalue problems in the same manner [78]. These methods can be further expanded to shape a manifold learning method by using the kernel trick, similarly to the approaches in [79]. Table II and Fig.…”
Section: Semisupervised DR
confidence: 99%
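The excerpt above mentions solving generalized eigenvalue problems and kernelizing the result via the kernel trick. A minimal sketch of that pattern follows; the RBF kernel, the scatter-like matrix `A = K @ K`, and the regularizer `eps` are illustrative assumptions, not the cited papers' exact construction.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))  # 50 toy samples, 4 features (stand-in for pixels)

# RBF kernel Gram matrix K: the "kernel trick" replaces explicit feature maps
# with pairwise inner products in an implicit reproducing kernel Hilbert space.
gamma = 0.5
sq = np.sum(X**2, axis=1)
K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

# Generalized eigenvalue problem A v = lam B v (hypothetical matrices):
# A plays the role of a scatter-like matrix in kernel space,
# B = K + eps*I is regularized so it is symmetric positive definite.
eps = 1e-6
A = K @ K
B = K + eps * np.eye(len(K))
vals, vecs = eigh(A, B)  # eigenvalues in ascending order

# Embed the data onto the leading generalized eigenvectors.
embedding = K @ vecs[:, -2:]
```

`scipy.linalg.eigh(A, B)` solves the generalized symmetric-definite problem directly, which is the same computational primitive the excerpt's DR methods rely on.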
“…structure (by the low-rankness) and the locally linear structure (by the sparseness) of data, hence it is both generative and discriminative. Morsier et al [27] presented a kernel low-rank and sparse graph, which was based on sample proximities in reproducing kernel Hilbert spaces and expressed sample relationships under sparse and low-rank constraints. However, data class structure is not considered in the above methods.…”
Section: HSI Classification Based on SBLS
confidence: 99%
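The graph described above represents each sample over the others under sparse and low-rank constraints. A minimal sketch of the sparse half (an SSC-style representation graph) is below; the cited kernel low-rank formulation additionally imposes a nuclear-norm constraint in an RKHS, which this toy version omits, and the `alpha` value and random data are assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 8))  # 30 toy samples, 8 spectral bands

n = X.shape[0]
C = np.zeros((n, n))
for i in range(n):
    # Sparsely code sample i over the remaining samples (columns of the dictionary).
    others = np.delete(np.arange(n), i)
    model = Lasso(alpha=0.05, max_iter=5000)
    model.fit(X[others].T, X[i])  # dictionary: (8 bands) x (29 atoms)
    C[i, others] = model.coef_

# Symmetrize the coefficient matrix into an affinity graph.
W = np.abs(C) + np.abs(C).T
```

The resulting `W` can feed any graph-based learner (spectral clustering, label propagation); the sparsity keeps each sample connected mainly to the few samples that best reconstruct it.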
“…Recently, deep learning (DL) has been found to automatically learn representative features from data by stacking multi-layer nonlinear units [10,11], with successful applications to HSI… Morsier et al. [27] presented a kernel low-rank and sparse graph, which was based on sample proximities in reproducing kernel Hilbert spaces and expressed sample relationships under sparse and low-rank constraints. However, data class structure is not considered in the above methods.…”
Section: Introduction
confidence: 99%
“…One of the state-of-the-art algorithms is the transductive support vector machine (TSVM) [19][20][21]. (3) Graph-based methods [22][23][24][25][26], which use labeled and unlabeled samples to construct graphs and minimize an energy function, thereby assigning labels to the unlabeled samples. (4) Wrapper-based methods, which apply a supervised learning method iteratively, labeling a certain number of unlabeled samples in each iteration.…”
Section: Introduction
confidence: 99%
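The graph-based energy minimization in point (3) can be sketched with a harmonic-function label propagation on a toy two-cluster graph; the Gaussian affinity, the cluster geometry, and one labeled point per class are illustrative assumptions, not any cited paper's setup.

```python
import numpy as np

rng = np.random.default_rng(2)
# Two well-separated toy clusters of 10 points each.
X = np.vstack([rng.normal(0, 0.3, (10, 2)), rng.normal(3, 0.3, (10, 2))])
n = len(X)

# Gaussian affinity graph and its combinatorial Laplacian L = D - W.
d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
W = np.exp(-d2)
np.fill_diagonal(W, 0)
L = np.diag(W.sum(1)) - W

# One labeled sample per cluster; the harmonic solution minimizes the
# energy f^T L f subject to the labels being fixed on the labeled nodes.
labeled = np.array([0, 10])
y = np.array([0.0, 1.0])
unlabeled = np.setdiff1d(np.arange(n), labeled)

f = np.zeros(n)
f[labeled] = y
# Solve L_uu f_u = -L_ul f_l for the unlabeled nodes.
f[unlabeled] = np.linalg.solve(L[np.ix_(unlabeled, unlabeled)],
                               -L[np.ix_(unlabeled, labeled)] @ y)
labels = (f > 0.5).astype(int)
```

By the maximum principle the harmonic values stay inside [0, 1], and thresholding at 0.5 assigns each unlabeled node the label of the cluster it is most strongly connected to.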