2020
DOI: 10.1109/tcyb.2020.2983102

Weighted Low-Rank Tensor Recovery for Hyperspectral Image Restoration

Abstract: Hyperspectral imaging, providing abundant spatial and spectral information simultaneously, has attracted a lot of interest in recent years. Unfortunately, due to hardware limitations, the hyperspectral image (HSI) is vulnerable to various degradations, such as noise (random noise; HSI denoising), blur (Gaussian and uniform blur; HSI deblurring), and downsampling (both spectral and spatial downsampling; HSI super-resolution). Previous HSI restoration methods are designed for one specific task only. Besides, mo…

Cited by 142 publications (68 citation statements)
References 99 publications
“…11. Spectral curves of the reconstructed HR-HSIs using the real dataset: (a) Pixel [40,180] in the compared results and Pixel [10,45] in the LR-HSI; (b) Pixel [120,120] in the compared results and Pixel [30,30] in the LR-HSI; (c) Pixel [168,60] in the compared results and Pixel [42,15] in the LR-HSI.…”
Section: Discussion
confidence: 99%
“…It regards the HSI as a third-order tensor and decomposes the tensor into a core tensor multiplied by factor matrices representing the three dimensions. Yi et al. [42] proposed a weighted low-rank tensor recovery (WLRTR) model that treated the singular values differently. Similar to the nonlocal self-similarity across space (NSS) method [43], the nonlocal similarity between spectral-spatial cubes and the spectral correlation can be characterized in tensors.…”
Section: Introduction
confidence: 99%
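The excerpt above summarizes the two ideas that recur throughout this literature: Tucker-style unfolding of a third-order HSI tensor and a weighted treatment of its singular values. Below is a minimal NumPy sketch of weighted singular value thresholding applied to one mode unfolding; the helper names (`unfold`, `fold`, `weighted_svt`) and the reweighting rule `1/(s + eps)` are illustrative assumptions, not the WLRTR authors' implementation.

```python
import numpy as np

def unfold(tensor, mode):
    """Matricize a tensor: move `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def fold(matrix, mode, shape):
    """Inverse of `unfold`: reshape the matrix back into a tensor of `shape`."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(matrix.reshape([shape[mode]] + rest), 0, mode)

def weighted_svt(matrix, tau, eps=1e-8):
    """Weighted singular value thresholding: large (informative) singular
    values get small weights and are shrunk less than small (noisy) ones."""
    U, s, Vt = np.linalg.svd(matrix, full_matrices=False)
    weights = 1.0 / (s + eps)                  # illustrative reweighting rule
    s_shrunk = np.maximum(s - tau * weights, 0.0)
    return (U * s_shrunk) @ Vt

# Example: shrink the spectral (mode-2) unfolding of a small noisy HSI cube.
noisy = np.random.rand(6, 6, 31)
restored = fold(weighted_svt(unfold(noisy, 2), tau=0.5), 2, noisy.shape)
```

Shrinking singular values with value-dependent weights, rather than penalizing all of them equally, is what distinguishes the weighted recovery model from a plain nuclear-norm approach.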
“…To explore nonlocal similarity inside an image, the patch length in the spatial domain is usually very small (for example, 4 × 4 and 6 × 6). This makes it difficult to accurately extract the intrinsic subspace bases of the spatial information for the HOSVD [42]. Meanwhile, we cannot afford the computational cost and memory load of a larger patch length in the spatial horizontal and vertical modes.…”
Section: C. Non-Local Similar Cubes Matching
confidence: 99%
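The cube-matching step discussed above can be sketched as a brute-force nearest-neighbor search over small spatial patches that keep the full spectral dimension. The function name `match_similar_cubes`, the stride, and the Euclidean distance are assumptions made for illustration; the cited papers may use a different search window or similarity measure.

```python
import numpy as np

def match_similar_cubes(hsi, ref_xy, patch=6, stride=2, k=30):
    """Group the k spatial patches most similar to a reference cube.

    hsi: (H, W, B) hyperspectral image. Each candidate cube has a small
    spatial footprint (patch x patch) but keeps all B bands, matching the
    small patch lengths (e.g., 4x4 or 6x6) mentioned in the text.
    """
    H, W, _ = hsi.shape
    x0, y0 = ref_xy
    ref = hsi[x0:x0 + patch, y0:y0 + patch, :]
    candidates = []
    for x in range(0, H - patch + 1, stride):
        for y in range(0, W - patch + 1, stride):
            cube = hsi[x:x + patch, y:y + patch, :]
            dist = np.sum((cube - ref) ** 2)   # Euclidean patch distance
            candidates.append((dist, x, y))
    candidates.sort(key=lambda t: t[0])
    # Stack the k best cubes into a 4th-order group tensor (patch, patch, B, k).
    return np.stack([hsi[x:x + patch, y:y + patch, :]
                     for _, x, y in candidates[:k]], axis=-1)
```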
“…After non-local low-rank modeling was first introduced to HSI denoising in [31], the flowchart of the non-local-based methods became fixed: FBP grouping and low-rank tensor approximation. Almost all researchers have focused on the low-rank tensor modeling of NLFBPGs, such as Tucker decomposition [31], sparsity-regularized Tucker decomposition [39], Laplacian scale mixture low-rank modeling [14], and weighted low-rank tensor recovery [9], to exploit the spatial non-local similarity and the spectral low-rank property simultaneously. However, as the number of spectral bands increases, the computational burden also increases significantly, impeding the application of these methods to real high-spectrum HSIs.…”
Section: Spatial: Non-Local Similarity
confidence: 99%
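The fixed two-stage flowchart mentioned above (group similar cubes, then approximate the group with a low-rank tensor) can be illustrated with a truncated HOSVD of one grouped cube tensor. The function name `truncated_hosvd` and the choice of multilinear ranks are assumptions; it stands in for whichever decomposition (Tucker, sparsity-regularized Tucker, WLRTR, etc.) a particular paper plugs into the second stage.

```python
import numpy as np

def truncated_hosvd(group, ranks):
    """Low-rank approximation of a grouped cube tensor via truncated HOSVD.

    group: 4th-order tensor (patch, patch, bands, k) of matched cubes.
    ranks: multilinear rank kept along each of the four modes.
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of each mode unfolding.
        unfolding = np.moveaxis(group, mode, 0).reshape(group.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # Project onto the truncated factor subspaces to obtain the core tensor.
    core = group.copy()
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    # Expand the core back with the factor matrices: the low-rank estimate.
    approx = core
    for mode, U in enumerate(factors):
        approx = np.moveaxis(np.tensordot(U, np.moveaxis(approx, mode, 0), axes=1), 0, mode)
    return approx
```

In the grouped-cube pipeline, each group tensor would be approximated this way and the restored cubes aggregated back to their spatial positions; the per-group SVDs along the spectral mode are also where the computational burden noted above grows with the number of bands.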
“…The following methods are used for the comparison: spectral low-rank methods, i.e., LRTA [33]; and finally NGmeet (Algorithm 1), which combines the best of the above two fields. Hyper-parameters of all compared methods are set based on the authors' codes or suggestions in the paper.…”
Section: Simulated Experiments
confidence: 99%