2021
DOI: 10.1109/jstars.2020.3040218
Affinity Matrix Learning Via Nonnegative Matrix Factorization for Hyperspectral Imagery Clustering

Abstract: In this paper, we integrate the spatial-spectral information of HSI samples into non-negative matrix factorization (NMF) for affinity matrix learning to address the issue of HSI clustering. This technique consists of three main components: i) oversegmentation for computing the spectral-spatial affinity matrix, ii) NMF with the guidance of the obtained affinity matrix and iii) density-based spectral clustering on the final affinity matrix. First, the HSI is oversegmented into superpixels via the entropy rate su…
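The abstract's three-step pipeline (oversegmentation, affinity-guided NMF, spectral clustering on the affinity) can be sketched in a simplified form. This is not the paper's method: entropy rate superpixel segmentation and the affinity-guided NMF regularizer are omitted, and the affinity here is built from plain NMF coefficients; the function name and all parameters are illustrative.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import SpectralClustering

def nmf_affinity_clustering(X, n_components=4, n_clusters=3, seed=0):
    """Simplified stand-in for the paper's pipeline: factorize the
    nonnegative spectra X (pixels x bands) with NMF, build an affinity
    matrix from the coefficient vectors, then cluster it spectrally."""
    # Step 1 (stand-in for affinity-guided NMF): low-dimensional
    # nonnegative codes W, one row per pixel.
    W = NMF(n_components=n_components, init="nndsvda",
            random_state=seed, max_iter=500).fit_transform(X)
    # Step 2: cosine-style affinity between coefficient vectors.
    Wn = W / (np.linalg.norm(W, axis=1, keepdims=True) + 1e-12)
    A = np.clip(Wn @ Wn.T, 0.0, 1.0)
    # Step 3: spectral clustering on the precomputed affinity matrix.
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="precomputed",
                                random_state=seed).fit_predict(A)
    return A, labels
```

The real method additionally feeds the spectral-spatial affinity from the superpixel stage back into the NMF objective, which this sketch does not attempt.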

Cited by 14 publications (9 citation statements)
References 41 publications
“…Therefore, these methods require O(N²) memory to store the affinity matrix and may suffer from an "out of memory" error in the training phase. Following [7], [12], [13], [17], [19], [23], [50], a subset of these datasets is used in our method for computational efficiency. In particular, the subset taken from the Salinas dataset is also known as the Salinas-A dataset.…”
Section: Results
confidence: 99%
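The O(N²) memory point above is easy to make concrete: a dense double-precision affinity matrix over the full Salinas scene (512 × 217 = 111,104 pixels, a standard figure for that dataset) would need roughly 92 GiB, which is why subsets such as Salinas-A are used. A quick back-of-the-envelope helper (illustrative, not from the paper):

```python
def affinity_memory_gib(n_samples, bytes_per_entry=8):
    """GiB needed to store a dense n x n affinity matrix
    (default: float64, 8 bytes per entry)."""
    return n_samples ** 2 * bytes_per_entry / 2 ** 30

# Full Salinas scene: 512 * 217 = 111,104 pixels -> ~92 GiB,
# far beyond typical GPU or workstation memory.
full_salinas = affinity_memory_gib(512 * 217)
```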
“…Thus, metrics or UFE methods achieving SOTA performance in computing affinity between HSI samples are considered candidates for the generation of N2P. Among these methods, local covariance matrix representation, which exploits spectral-spatial information, has been successfully applied to both unsupervised and supervised HSIC [24,25,69]. Compared with SAM and Euclidean distances, it exploits the similarities and variances of local samples and has proved suitable for measuring distances between HSI samples.…”
Section: Generation of N2P
confidence: 99%
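The local covariance matrix representation mentioned above describes each pixel by the covariance of the spectra in a small spatial window, and compares pixels by a distance between SPD matrices. A minimal sketch, assuming a log-Euclidean distance (one common choice for covariance descriptors; the exact metric in [24,25,69] may differ, and all names here are illustrative):

```python
import numpy as np

def local_covariance(cube, r, c, win=5, eps=1e-6):
    """Covariance descriptor of the spectra inside a (win x win)
    window centred on pixel (r, c) of an HSI cube (rows x cols x bands)."""
    h = win // 2
    patch = cube[max(r - h, 0):r + h + 1,
                 max(c - h, 0):c + h + 1].reshape(-1, cube.shape[2])
    C = np.cov(patch, rowvar=False)
    return C + eps * np.eye(C.shape[0])  # regularize to keep C SPD

def logeuclid_dist(C1, C2):
    """Log-Euclidean distance between two SPD covariance descriptors:
    Frobenius norm of the difference of matrix logarithms."""
    def logm(C):
        w, V = np.linalg.eigh(C)         # C SPD -> w > 0
        return (V * np.log(w)) @ V.T
    return np.linalg.norm(logm(C1) - logm(C2), "fro")
```

Unlike SAM or Euclidean distance on a single spectrum, the descriptor summarizes the variability of a whole neighbourhood, which is why it captures spectral-spatial information.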
“…Since HSI annotation usually requires extensive field data collection campaigns, which are costly and impractical when the HSI scene involves uncooperative areas [24], a few published works focus on unsupervised classification, i.e., clustering, which directly models the intrinsic characteristics of HSI samples to form several clusters [25]. Typical HSI clustering methods include k-means, fuzzy c-means, etc.…”
Section: Introduction
confidence: 99%
“…However, Spectral GCN has a high time complexity due to the need to calculate the eigenvalues of the graph Laplacian matrix. To improve the computational efficiency of Spectral GCN, literature [34] proposed the Chebyshev network ChebyNet. ChebyNet uses a Chebyshev expansion to approximate the convolution operation, which significantly improves the efficiency of the graph convolution operation.…”
Section: Graph Convolutional Network
confidence: 99%
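The ChebyNet idea quoted above replaces the eigendecomposition of the graph Laplacian with a K-order Chebyshev polynomial of the rescaled Laplacian, evaluated via the three-term recurrence. A minimal sketch with scalar coefficients (in ChebyNet proper, each order carries a learned weight matrix; names and parameters here are illustrative):

```python
import numpy as np

def chebyshev_gconv(X, L_tilde, theta):
    """ChebyNet-style filtering: sum_k theta[k] * T_k(L_tilde) @ X,
    where T_k are Chebyshev polynomials and L_tilde = 2L/lmax - I is
    the Laplacian rescaled to eigenvalues in [-1, 1].
    No eigendecomposition is needed, only sparse-friendly matvecs."""
    T_prev, T_curr = X, L_tilde @ X      # T_0(L)X = X, T_1(L)X = LX
    out = theta[0] * T_prev
    if len(theta) > 1:
        out += theta[1] * T_curr
    for k in range(2, len(theta)):
        # Chebyshev recurrence: T_k = 2 L T_{k-1} - T_{k-2}
        T_prev, T_curr = T_curr, 2 * (L_tilde @ T_curr) - T_prev
        out += theta[k] * T_curr
    return out
```

Because each step is a matrix-vector product with the (usually sparse) Laplacian, the cost is O(K·|E|) per feature column rather than the O(N³) of a full eigendecomposition.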