2018 IEEE International Conference on Real-Time Computing and Robotics (RCAR)
DOI: 10.1109/rcar.2018.8621800

Semi-Supervised Subspace Clustering via Non-Negative Low-Rank Representation for Hyperspectral Images

Cited by 3 publications (5 citation statements). References 10 publications.
“…However, sometimes a few labelled data points might be accessible, which can provide helpful supervised information to guide clustering algorithms to better learn the cluster structure of the data. By incorporating supervised information, semi-supervised clustering methods are developed in [54,55,57]. The idea of [54,55] focuses on refining the coefficient matrix of self-representation models with supervised information to obtain a more block-diagonal similarity matrix.…”
Section: Object-based Clustering Methods
Citation type: mentioning, confidence: 99%
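
As a concrete illustration of that refinement idea, the sketch below suppresses coefficients that connect labelled samples from different classes, nudging the induced similarity matrix toward a block-diagonal form. It is a generic, hypothetical example rather than the exact formulation of [54] or [55]; the function name refine_coefficients and the convention of marking unlabelled samples with -1 are assumptions of this sketch.

import numpy as np

def refine_coefficients(C, labels):
    """Refine a self-representation coefficient matrix with partial labels.

    Generic illustration: entries linking labelled samples of *different*
    classes are zeroed, so the induced similarity matrix becomes closer to
    block-diagonal. `labels` holds an integer class id per sample and -1 for
    unlabelled samples.
    """
    C = C.copy()
    n = C.shape[0]
    for i in range(n):
        for j in range(n):
            if labels[i] >= 0 and labels[j] >= 0 and labels[i] != labels[j]:
                C[i, j] = 0.0  # cannot-link: samples with different known classes
    # Symmetric, non-negative affinity usable by spectral clustering
    W = 0.5 * (np.abs(C) + np.abs(C).T)
    return W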
“…Different from [54,55], the authors of [56,57] propagate the label information in a graph that is obtained by solving a self-representation model. Let X_l ∈ ℝ^{B×l} be the labelled data, X_u be the unlabelled data, X = [X_l, X_u], Y_l ∈ ℝ^{c×l} be the one-hot label matrix of X_l, and F = [F_l, F_u] be the predicted label matrix of X.…”
Section: Object-based Clustering Methods
Citation type: mentioning, confidence: 99%
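
To make the quoted setup concrete, the sketch below applies a standard graph-based label propagation rule (in the spirit of local-and-global-consistency propagation) to an affinity matrix W, which in [56,57] would be built from a self-representation model. It is a minimal illustration, not the exact scheme of those papers; the function name, the alpha parameter, and the samples-as-rows convention (transposed relative to the quoted ℝ^{B×l} layout) are assumptions of this sketch.

import numpy as np

def propagate_labels(W, Y_l, labelled_idx, alpha=0.9):
    """Propagate labels over a graph given by the n x n affinity matrix W.

    Y_l is the one-hot label matrix (l x c) of the labelled samples, and
    labelled_idx gives their row indices; rows of unlabelled samples start
    at zero and receive labels through the propagation step.
    """
    n = W.shape[0]
    c = Y_l.shape[1]
    Y = np.zeros((n, c))
    Y[labelled_idx] = Y_l

    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ W @ D_inv_sqrt                # symmetrically normalised affinity
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)  # closed-form propagation
    return F.argmax(axis=1)                        # predicted class per sample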
“…With this belief, LRR enforces the coefficient matrix to take a block-diagonal form and thereby effectively recovers the subspace structures of the data [26]. Owing to its robustness and discriminative power, LRR has been intensively studied for improvement [31], [39], [41], [42] and has achieved many promising results in applications [28], [30]. As shown in FIGURE 2, LRR is not only robust to noise but can also capture the discriminative structure of the given data.…”
Section: Low-Rank Representation
Citation type: mentioning, confidence: 99%
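
As a concrete reference point for the quoted description, the sketch below computes the classical closed-form LRR solution for clean data, where min ||Z||_* s.t. X = XZ is minimised by the shape interaction matrix Z = V_r V_r^T, and then forms a symmetric affinity for spectral clustering. This is a minimal sketch of generic LRR, not the semi-supervised non-negative variant proposed in the paper above; noisy data would instead require an iterative solver (e.g. ADMM) for min ||Z||_* + lambda * ||E||_{2,1} s.t. X = XZ + E.

import numpy as np

def lrr_noiseless(X, tol=1e-10):
    """Closed-form low-rank representation for clean data.

    X holds one sample per column (e.g. B bands x n pixels for hyperspectral
    data). The minimiser of min ||Z||_* s.t. X = XZ is Z = V_r V_r^T, where
    V_r collects the right singular vectors of X with nonzero singular values.
    """
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    r = int((s > tol * s.max()).sum())       # numerical rank of X
    V_r = Vt[:r].T                           # n x r right singular vectors
    Z = V_r @ V_r.T                          # block-diagonal in the ideal case
    W = 0.5 * (np.abs(Z) + np.abs(Z).T)      # affinity for spectral clustering
    return Z, W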