2013
DOI: 10.1109/tpami.2012.63
Laplacian Sparse Coding, Hypergraph Laplacian Sparse Coding, and Applications

Abstract: Sparse coding exhibits good performance in many computer vision applications. However, due to the overcomplete codebook and the independent coding process, the locality and the similarity among the instances to be encoded are lost. To preserve such locality and similarity information, we propose a Laplacian sparse coding (LSc) framework. By incorporating the similarity preserving term into the objective of sparse coding, our proposed Laplacian sparse coding can alleviate the instability of sparse codes. Furthe…
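
For readers skimming the truncated abstract, the similarity-preserving idea can be made concrete with a worked objective. The notation below is assumed for illustration (data matrix X, codebook B, sparse codes S with columns s_i, graph Laplacian L built from instance similarities W), not quoted from the paper; a Laplacian sparse coding objective is commonly written as

    \min_{B,\,S}\ \|X - BS\|_F^2 \;+\; \lambda \sum_i \|s_i\|_1 \;+\; \beta\,\operatorname{tr}\!\left(S L S^{\top}\right)
    \quad \text{s.t.}\ \|b_j\|_2 \le 1 \ \ \forall j,

where L = D_W - W and D_W is the degree matrix of W. Since \operatorname{tr}(S L S^{\top}) = \tfrac{1}{2}\sum_{i,j} W_{ij}\,\|s_i - s_j\|_2^2, the added term penalizes similar instances receiving dissimilar codes, which is the similarity-preserving role the abstract describes.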

Cited by 338 publications (196 citation statements)
References 26 publications
“…In addition, in conformity with previous work [13], the Laplacian regularized Lasso is also solved by a modified feature-sign search (FSS) method. In this paper, we try to present a fair comparison among all priors.…”
Section: B. Models and Methods (mentioning)
confidence: 81%
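
For orientation, below is a minimal numpy sketch of the Laplacian regularized Lasso objective this statement refers to. The variable names and the plain proximal-gradient (ISTA) step are illustrative assumptions; the cited works solve the problem with a modified feature-sign search rather than ISTA.

    import numpy as np

    def laplacian_lasso_objective(X, D, S, L, lam, beta):
        # ||X - D S||_F^2 + lam * sum_i ||s_i||_1 + beta * tr(S L S^T)
        # X: (d, n) signals, D: (d, k) dictionary, S: (k, n) codes, L: (n, n) graph Laplacian.
        recon = np.linalg.norm(X - D @ S, 'fro') ** 2
        sparsity = lam * np.abs(S).sum()
        smooth = beta * np.trace(S @ L @ S.T)
        return recon + sparsity + smooth

    def ista_step(X, D, S, L, lam, beta, step):
        # One proximal-gradient step: gradient of the smooth terms, then soft-thresholding.
        grad = 2 * D.T @ (D @ S - X) + 2 * beta * S @ L
        Z = S - step * grad
        return np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)

    # Tiny smoke test with random data (sizes are arbitrary).
    rng = np.random.default_rng(0)
    X = rng.standard_normal((8, 5))
    D = rng.standard_normal((8, 12))
    S = np.zeros((12, 5))
    W = np.ones((5, 5)) - np.eye(5)        # toy similarity graph over the 5 signals
    L = np.diag(W.sum(axis=1)) - W         # combinatorial graph Laplacian
    for _ in range(200):
        S = ista_step(X, D, S, L, lam=0.1, beta=0.05, step=1e-2)
    print(round(laplacian_lasso_objective(X, D, S, L, 0.1, 0.05), 3))

The graph term couples the codes of neighboring signals, which is what distinguishes this prior from the plain Lasso.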
“…Recently, structured priors, which can be sorted into three categories, have been incorporated into HSI classification [7]. (a) Priors that only exploit the correlations and dependencies among the neighboring spectral pixels or their sparse coefficient vectors, which include joint sparsity [12], the graph regularized Lasso (referred to as the Laplacian regularized Lasso) [13], and the low-rank Lasso [14]. (b) Priors that only exploit the inherent structure of the dictionary, such as the group Lasso [15].…”
Section: Introduction (mentioning)
confidence: 99%
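
As a quick reference for the priors named above (notation assumed here, not taken from the cited papers): with a coefficient matrix S whose columns are the sparse vectors of neighboring pixels, representative penalties are commonly written as

    \text{joint sparsity: } \|S\|_{2,1} = \sum_r \|S_{r,:}\|_2, \qquad
    \text{Laplacian regularization: } \operatorname{tr}\!\left(S L S^{\top}\right), \qquad
    \text{low rank: } \|S\|_{*}\ \text{(nuclear norm)}, \qquad
    \text{group Lasso: } \sum_{g \in \mathcal{G}} \|s_{g}\|_2,

where the first three couple the coefficient vectors of neighboring pixels and the group Lasso term encodes structure within the dictionary, matching the split between (a) and (b) above.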
“…Based on the fact that pixels within a local patch (where the center pixel is the target pixel to be processed) have a high probability of being associated with the same thematic class, the sparse representation coefficients of these pixels are also expected to be similar. Therefore, the spatial pooling operation can be considered a reasonable way to merge these similar sparse representation coefficient vectors and yield a new feature vector with better discriminative ability, which is an approach that has been widely used in the literature [33,34].…”
Section: Incorporating Spatial Information With the Spatial Max Pooling (mentioning)
confidence: 99%
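
A minimal numpy sketch of the spatial max pooling described in this statement; pooling absolute values across the patch is one common convention, and the patch size and names here are assumptions for illustration only.

    import numpy as np

    def spatial_max_pool(patch_codes):
        # patch_codes: (n_pixels, k) sparse coefficient vectors of the pixels in a local patch.
        # Returns one (k,) feature vector for the center pixel by element-wise max pooling.
        return np.abs(patch_codes).max(axis=0)

    # Toy usage: a 3x3 patch, each pixel carrying a k = 6 sparse code.
    rng = np.random.default_rng(1)
    patch_codes = rng.standard_normal((9, 6)) * (rng.random((9, 6)) < 0.3)
    feature = spatial_max_pool(patch_codes)
    print(feature.shape)   # (6,)

Because nearby pixels tend to share a thematic class, their codes activate similar atoms, so the pooled vector keeps the strongest shared activations and smooths out pixel-level instability.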
“…Afterwards, by combining these features with SPM, the authors propose Kernel Sparse Representation Spatial Pyramid Matching (KSRSPM). Besides this approach, Gao et al. [66] explore another sparse coding-based approach (LScSPM) by considering the unstable sparse codes produced by different sparse coding techniques [60,62]. The authors use a Laplacian sparse coding framework to address this issue.…”
Section: Literature Review (mentioning)
confidence: 99%