2021
DOI: 10.1109/tgrs.2020.3029578

Random Subspace-Based k-Nearest Class Collaborative Representation for Hyperspectral Image Classification


Cited by 34 publications (11 citation statements)
References 41 publications
“…After clustering, any fragment whose area is smaller than a predefined threshold is deleted at each iteration, and the pixels surrounding the seeds are then merged into a larger superpixel. Each seed point is moved within a 3 × 3 neighborhood to the position with the lowest gradient, which prevents seed points from lying on image boundaries and degrading the overall clustering result [33], [48], [49].…”
Section: A. HSI Representation Using Superpixel Segmentation (mentioning)
confidence: 99%
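The seed-adjustment step quoted above can be sketched in a few lines. The snippet below is a minimal illustration, not the cited authors' implementation: it assumes the hyperspectral image has already been reduced to a single 2-D band (for example, its first principal component), and the function name adjust_seeds, the (row, col) seed format, and the plain finite-difference gradient are choices made here for clarity.

import numpy as np

def adjust_seeds(band, seeds):
    # Move each seed to the lowest-gradient pixel in its 3 x 3 neighborhood,
    # so that seeds do not end up on image boundaries (see the passage above).
    # 'band' is a single 2-D image and 'seeds' is a list of (row, col)
    # positions -- both are assumptions made for this illustration.
    gy, gx = np.gradient(band.astype(float))      # finite-difference gradients
    grad = np.hypot(gx, gy)                       # gradient magnitude
    adjusted = []
    for r, c in seeds:
        r0, r1 = max(r - 1, 0), min(r + 2, band.shape[0])   # clip the 3 x 3
        c0, c1 = max(c - 1, 0), min(c + 2, band.shape[1])   # window at borders
        window = grad[r0:r1, c0:c1]
        dr, dc = np.unravel_index(np.argmin(window), window.shape)
        adjusted.append((r0 + dr, c0 + dc))       # new lowest-gradient position
    return adjusted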
“…Given the training set X and the testing set Y, in CRC the approximation ŷ of a testing sample y is represented as a linear combination of the training samples X [67]:…”
Section: A. CR-Based Classifiers Used in DES (mentioning)
confidence: 99%
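The collaborative representation step quoted above (coding a test sample over all training samples with a ridge penalty, then assigning the class with the smallest class-wise reconstruction residual) can be sketched as follows. This is a generic CRC-RLS-style sketch, not the code of the cited papers; the regularization weight lam and the normalization of the residual by the coefficient norm are common choices assumed here.

import numpy as np

def crc_classify(X, labels, y, lam=1e-3):
    # X: (d, n) training samples as columns; labels: (n,) class of each column;
    # y: (d,) test sample; lam: ridge weight (an assumed value).
    # Collaborative coding: alpha = (X^T X + lam I)^{-1} X^T y, i.e. y is
    # approximated by a linear combination of all training samples.
    G = X.T @ X + lam * np.eye(X.shape[1])
    alpha = np.linalg.solve(G, X.T @ y)
    # Assign the class whose coefficients reconstruct y with the smallest
    # (coefficient-normalized) residual, as in the usual CRC decision rule.
    best_cls, best_res = None, np.inf
    for c in np.unique(labels):
        idx = labels == c
        y_hat_c = X[:, idx] @ alpha[idx]          # class-wise approximation of y
        res = np.linalg.norm(y - y_hat_c) / (np.linalg.norm(alpha[idx]) + 1e-12)
        if res < best_res:
            best_cls, best_res = c, res
    return best_cls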
“…A locally adaptive form of nearest neighbor classification (LANN) is proposed here to mitigate the curse of dimensionality [17]. An effective metric is used to compute neighborhoods: local decision boundaries are determined from centroid information, and the neighborhoods are then shrunk in directions orthogonal to these local decision boundaries and extended parallel to the boundaries [18]-[20].…”
Section: Introduction (mentioning)
confidence: 99%
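The locally adaptive neighborhood described in the last statement follows the spirit of discriminant adaptive nearest neighbor classification: a local metric is built from the within- and between-class scatter of a query point's neighbors so that distances are stretched parallel to the estimated decision boundary and shrunk orthogonal to it. The sketch below is one common formulation under simplified assumptions; the function name dann_metric, the softening parameter eps, and the small ridge added before inverting the within-class scatter are choices made here, not taken from the cited references.

import numpy as np

def dann_metric(X_nbr, y_nbr, eps=1.0):
    # X_nbr: (k, d) features of the k points nearest to the query;
    # y_nbr: (k,) their class labels; eps: softening parameter (assumed value).
    # Returns a (d, d) matrix Sigma used as a local metric (x - q)^T Sigma (x - q):
    # it shrinks the neighborhood orthogonal to the local decision boundary
    # and extends it parallel to the boundary.
    k, d = X_nbr.shape
    classes, counts = np.unique(y_nbr, return_counts=True)
    mean_all = X_nbr.mean(axis=0)
    W = np.zeros((d, d))                          # local within-class scatter
    B = np.zeros((d, d))                          # local between-class scatter
    for c, n_c in zip(classes, counts):
        Xc = X_nbr[y_nbr == c]
        mc = Xc.mean(axis=0)
        W += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)     # centroid information
        B += n_c * (diff @ diff.T)
    W /= k
    B /= k
    # Sigma = W^{-1/2} (W^{-1/2} B W^{-1/2} + eps I) W^{-1/2}; the small ridge
    # on W is added only to keep the inverse square root well defined.
    evals, evecs = np.linalg.eigh(W + 1e-6 * np.eye(d))
    W_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    B_star = W_inv_sqrt @ B @ W_inv_sqrt
    return W_inv_sqrt @ (B_star + eps * np.eye(d)) @ W_inv_sqrt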