2022
DOI: 10.1109/lgrs.2021.3070074
Grouped Collaborative Representation for Hyperspectral Image Classification Using a Two-Phase Strategy

Cited by 11 publications (6 citation statements)
References 14 publications
“…S7: Pre-classify the features by SVM to assign appropriate feature weights to each feature. S8: Superpixel-guided RCR classification for each superpixel region k up to K: 1) Solve the RCR problem from (16) to (20) and find coefficient matrices {X_k^v}. 2) Evaluate the residuals by (21).…”
Section: E Relaxed Collaborative Representationmentioning
confidence: 99%
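The quoted steps classify each region by solving a regularized representation problem and then comparing class-wise reconstruction residuals. As a minimal sketch of that represent-then-compare-residuals idea, here is a plain collaborative representation classifier (a simplification, not the paper's superpixel-guided RCR; the function name, λ value, and toy data are illustrative assumptions):

```python
import numpy as np

def crc_classify(A, labels, y, lam=0.01):
    """Collaborative representation classification (illustrative sketch):
    represent test pixel y over the whole dictionary A, then assign the
    class whose atoms yield the smallest reconstruction residual."""
    # Ridge (Tikhonov-regularized) solution: x = (A^T A + lam*I)^{-1} A^T y
    G = A.T @ A + lam * np.eye(A.shape[1])
    x = np.linalg.solve(G, A.T @ y)
    residuals = {}
    for c in np.unique(labels):
        mask = labels == c
        # Reconstruct y using only class c's atoms and coefficients
        residuals[c] = np.linalg.norm(y - A[:, mask] @ x[mask])
    return min(residuals, key=residuals.get)

# Toy example: two one-atom classes; y lies near the class-0 atom.
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
labels = np.array([0, 1])
y = np.array([1.0, 0.05])
pred = crc_classify(A, labels, y)
```

In the paper's two-phase setting this residual comparison is done per superpixel with relaxed (multi-feature) coefficients rather than per pixel.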
“…Zhang et al. indicated that collaborative representation of training samples is as important as sparsity. Therefore, several CRC-based HSIC methods have been proposed [16]-[20]. [16] uses a distance-weighted Tikhonov regularization.…”
Section: Introductionmentioning
confidence: 99%
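The distance-weighted Tikhonov regularization mentioned for [16] replaces the uniform ridge penalty with a diagonal one that grows with each atom's distance from the test pixel, so coefficients concentrate on nearby training samples. A hedged sketch of that idea (not the exact formulation in [16]; the function name and λ are assumptions):

```python
import numpy as np

def dw_tikhonov_coeffs(A, y, lam=0.01):
    """Distance-weighted Tikhonov regularization (illustrative sketch):
    solve x = (A^T A + lam * Gamma^T Gamma)^{-1} A^T y, where
    Gamma_ii = ||y - a_i||_2 penalizes atoms far from the test pixel y."""
    # Per-atom distances to the test pixel (columns of A are atoms)
    gamma = np.linalg.norm(A - y[:, None], axis=0)
    G = A.T @ A + lam * np.diag(gamma ** 2)
    return np.linalg.solve(G, A.T @ y)

# Toy example: the atom closer to y receives the larger coefficient.
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
y = np.array([1.0, 0.05])
x = dw_tikhonov_coeffs(A, y)
```

Compared with a plain ridge penalty, this makes the representation locally adaptive without any explicit sparsity constraint.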
“…The first part refers to supervised learning-based methods that require an independent training process and predict testing data using an already learned model. The representative supervised models include sparse/collaborative representation [8][9][10], support vector machine [11][12][13], ensemble learning [14][15][16][17], and so on. The second part refers to unsupervised learning-based models that do not demand training samples and determine entire classes by considering the correlations among samples.…”
Section: Introductionmentioning
confidence: 99%
“…Unsupervised learning, also known as clustering, does not need training samples but interprets the data by exploring the structure and correlation information among the input data; examples include K-means [16], ISODATA [17], DBSCAN [18], Fuzzy C-Means [19], etc. Supervised learning needs a group of training samples to train the model until optimal model parameters are obtained, and then applies the trained model to the test samples to observe their behavior; examples include sparse/collaborative representation [20], ensemble learning [21], and support vector machine [22]. Unlike unsupervised and supervised learning, semisupervised learning introduces unlabeled samples into the training process for the sake of improving the robustness of the method.…”
Section: Introductionmentioning
confidence: 99%
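The unsupervised setting described above can be illustrated with K-means, which recovers cluster structure from the data alone with no labels. A minimal sketch (deterministic farthest-point initialization is an assumption chosen here for reproducibility, not part of classic K-means):

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Minimal K-means sketch: no labels are used; cluster structure
    comes entirely from the data, as in the unsupervised case above."""
    # Farthest-point initialization: deterministic and spreads centers out.
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # Assign each sample to its nearest center, then recompute means.
        assign = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(axis=0)
    return assign, centers

# Toy example: two well-separated point groups are recovered as clusters.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [10.0, 10.0], [10.0, 11.0], [11.0, 10.0]])
assign, centers = kmeans(X, 2)
```

Supervised methods such as CRC or SVM would instead fit their parameters on a labeled subset before being applied to test pixels.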