Hyperspectral image compressive sensing reconstruction (HSI-CSR) is an important problem in remote sensing and has recently been investigated increasingly through sparsity-prior-based approaches. However, most available HSI-CSR methods impose the sparsity prior in the spatial and spectral vector domains by vectorizing hyperspectral cubes along a certain dimension. Moreover, little attention has been paid in previous works to exploiting the underlying nonlocal structure in the spatial domain of an HSI. In this paper, we propose a nonlocal tensor sparse and low-rank regularization (NTSRLR) approach that encodes the essential structured sparsity of an HSI and exploits its advantages for the HSI-CSR task. Specifically, we study how to use the l1-based sparsity of the core tensor and the tensor nuclear norm as tensor sparse and low-rank regularizers, respectively, to describe the nonlocal spatial-spectral correlation hidden in an HSI. To solve the resulting minimization problem, we design a fast implementation strategy based on the alternating direction method of multipliers (ADMM). Experimental results on various HSI datasets verify that the proposed HSI-CSR algorithm significantly outperforms existing state-of-the-art CSR techniques for HSI recovery.
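In ADMM schemes of this kind, the l1 and nuclear-norm regularizers are typically handled through their proximal operators: soft thresholding and singular value thresholding, respectively. A minimal sketch of these two building blocks (function names are illustrative, not taken from the paper; this is not the authors' full algorithm):

```python
import numpy as np

def soft_threshold(x, tau):
    """Soft thresholding: proximal operator of the l1 norm.
    Shrinks each entry of x toward zero by tau, the standard
    ADMM sub-step for an l1-sparsity term (e.g. on a core tensor)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def svt(M, tau):
    """Singular value thresholding: proximal operator of the
    matrix nuclear norm. Shrinks each singular value of M by tau,
    the standard ADMM sub-step for a low-rank (nuclear-norm) term."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt
```

An ADMM iteration for a tensor model would apply these operators to unfoldings or factor variables of the tensor, alternating with a data-fidelity update and a dual update.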
Existing methods for tensor completion (TC) have limited ability to characterize low-rank structures. To depict the complex hierarchical knowledge with implicit sparsity attributes hidden in a tensor, we propose a new multi-layer sparsity-based tensor decomposition (MLSTD) for low-rank tensor completion (LRTC). The method encodes the structured sparsity of a tensor through a multiple-layer representation. Specifically, we use the CANDECOMP/PARAFAC (CP) model to decompose a tensor into a sum of rank-1 tensors, and the number of rank-1 components is easily interpreted as a first-layer sparsity measure. The factor matrices are presumably smooth, since a local piecewise property exists in the within-mode correlation. In the subspace, this local smoothness can be regarded as the second-layer sparsity. To describe the refined structures of factor/subspace sparsity, we introduce a new sparsity insight of subspace smoothness: a self-adaptive low-rank matrix factorization scheme, called the third-layer sparsity. Through this progressive description of the sparsity structure, we formulate an MLSTD model and embed it into the LRTC problem. An effective ADMM algorithm is then designed for the MLSTD minimization problem. Extensive experiments on RGB images, hyperspectral images, and videos substantiate that the proposed LRTC method is superior to state-of-the-art methods.
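The CP model underlying the first sparsity layer expresses a third-order tensor as a sum of R rank-1 tensors, X[i,j,k] = sum_r A[i,r]*B[j,r]*C[k,r], where R (the number of components) is the first-layer sparsity measure. A minimal sketch of assembling such a tensor from its factor matrices (the function name is illustrative; this shows only the CP structure, not the MLSTD completion algorithm):

```python
import numpy as np

def cp_to_tensor(A, B, C):
    """Assemble a third-order tensor from CP factor matrices
    A (I x R), B (J x R), C (K x R) as a sum of R rank-1 tensors:
    X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r].
    The shared column count R is the CP rank."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Example: a rank-2 tensor built from random factors.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 2
A, B, C = rng.standard_normal((I, R)), rng.standard_normal((J, R)), rng.standard_normal((K, R))
X = cp_to_tensor(A, B, C)  # shape (4, 5, 6)
```

In the completion setting, the factors A, B, C would be estimated from the observed entries only, with the additional smoothness and subspace low-rank penalties applied to the factor matrices.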