2019
DOI: 10.3390/jimaging5110085
Endmember Learning with K-Means through SCD Model in Hyperspectral Scene Reconstructions

Abstract: This paper proposes a simple yet effective method for improving the efficiency of sparse coding dictionary learning (DL), with the implication of enhancing the ultimate usefulness of compressive sensing (CS) technology for practical applications such as hyperspectral imaging (HSI) scene reconstruction. CS is a technique that allows a sparse signal to be decomposed into a sparse representation "a" over a dictionary D_u. The goodness of the learnt dictionary has a direct impact on the quality of the end…
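The decomposition the abstract describes, a signal y expressed as a sparse code a over a dictionary D, is commonly computed with a greedy pursuit. Below is a minimal sketch of orthogonal matching pursuit (OMP) in plain NumPy; the dictionary, dimensions, and sparsity level are illustrative, not taken from the paper:

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: find a k-sparse code a with y ~= D @ a."""
    residual = y.copy()
    support = []
    a = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit the signal on all selected atoms (least squares).
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    a[support] = coef
    return a

# Toy dictionary with unit-norm atoms and a 2-sparse synthetic signal.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)
a_true = np.zeros(50)
a_true[[3, 17]] = [1.5, -2.0]
y = D @ a_true
a_hat = omp(D, y, k=2)
```

Dictionary learning alternates a sparse coding step like this with an update of D itself; the quality of that learnt D is what the paper targets.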

Cited by 7 publications (5 citation statements)
References 31 publications
“…In order to statistically evaluate the various sparse deconvolution algorithms from section 3 for TOF estimation application, 300 pseudorandom signals are generated for each algorithm and for each SNR, and then TOF estimation is performed according to section 4. In the following, the Greedy algorithms NNOMP [41,42], FNNOMP [43] and sNMF [26] of the family of OMP as well as NNOLS and sNNOLS [41,42] of the family of OLS and the convex approach NLARS [44] are tested. In particular, the influence of the k reconstructed peaks in h is investigated.…”
Section: Comparison of Deconvolution Methods (mentioning)
confidence: 99%
“…Nevertheless, these NMF methods are highly sensitive to their initialization, especially when applied to an entire image subject to spectral variability [16,35,36]. New approaches have also been proposed, like in the K-Means Sparse Coding Dictionary (KMSCD) algorithm [37], which relies on a K-means clustering [38] to initialize a Sparse Coding Dictionary method [39]. These approaches exhibit the benefit of a clustering step in hyperspectral unmixing.…”
Section: Roof Tile (mentioning)
confidence: 99%
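The KMSCD idea quoted above, using K-means centroids to initialize the atoms of a sparse coding dictionary, can be sketched compactly. The following is a toy illustration, not the paper's implementation: the "pixel spectra", cluster count, and band count are all invented for the example:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain Lloyd's algorithm; returns the cluster centroids of the rows of X."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each sample (here, a pixel spectrum) to its nearest centroid.
        labels = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

# Hypothetical toy "hyperspectral" data: 200 pixel spectra over 10 bands,
# drawn around 3 distinct endmember-like reference spectra.
rng = np.random.default_rng(1)
endmembers = rng.uniform(0.1, 1.0, size=(3, 10))
pixels = endmembers[rng.integers(0, 3, 200)] + 0.01 * rng.standard_normal((200, 10))

# KMSCD-style initialization: take the centroids as the initial dictionary
# atoms (unit-normalized) before sparse coding dictionary learning refines them.
D0 = kmeans(pixels, k=3)
D0 /= np.linalg.norm(D0, axis=1, keepdims=True)
```

The appeal, as the excerpt notes, is that clustering gives the dictionary learner a data-adapted starting point instead of a random one, which matters for methods that are sensitive to initialization.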
“…They treat the fusion of LR HSI and HR MSI as an ill-posed inverse problem. Equations are first established by observation model [21][22][23] and then constrained by many handcraft priors. Popular priors contain the sparse prior [16,18,19] and lowrankness prior [24,25].…”
Section: Introduction (mentioning)
confidence: 99%
“…These equations are solved by iterative optimization methods such as alternating direction method of multipliers (ADMM) [26] and gradient descent algorithm [27]. Dictionary learning methods [22,23] are a representative kind of VM-based methods. By using sparse representation, they can combine the dictionaries from LR HSI and the high-resolution sparse coefficients from HR MSI to obtain HR HSI.…”
Section: Introduction (mentioning)
confidence: 99%
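The fusion scheme the last excerpt describes, pairing a spectral dictionary from the LR HSI with sparse coefficients estimated from the HR MSI, can be sketched as a toy under strong simplifying assumptions: the spectral response R is known, the dictionary is given rather than learned, and the coefficient fit is an exactly determined least-squares solve instead of a sparse one. All dimensions and data below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
bands, msi_bands, atoms, pixels = 30, 4, 4, 100

# Stand-in for a spectral dictionary learned from the LR HSI.
D = np.abs(rng.standard_normal((bands, atoms)))
# Spectral response mapping the HSI bands onto the MSI bands (assumed known).
R = np.abs(rng.standard_normal((msi_bands, bands)))

# Ground-truth HR HSI pixels as combinations of the dictionary atoms.
A_true = np.abs(rng.standard_normal((atoms, pixels)))
hsi_hr = D @ A_true
msi_hr = R @ hsi_hr  # the observed HR MSI

# Fusion step: estimate the per-pixel coefficients from the HR MSI using the
# projected dictionary R @ D, then reconstruct the HR HSI with the full D.
A_hat, *_ = np.linalg.lstsq(R @ D, msi_hr, rcond=None)
hsi_rec = D @ A_hat
```

In the actual methods cited, the coefficient step enforces sparsity (and the inverse problem is regularized by the priors mentioned above), but the division of labor is the same: spectra come from the LR HSI dictionary, spatial detail comes from the HR MSI coefficients.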