2016
DOI: 10.1109/tpami.2015.2462360

Laplacian Regularized Low-Rank Representation and Its Applications

Abstract: Low-rank representation (LRR) has recently attracted a great deal of attention due to its pleasing efficacy in exploring low-dimensional subspace structures embedded in data. For a given set of observed data corrupted with sparse errors, LRR aims at learning a lowest-rank representation of all data jointly. LRR has broad applications in pattern recognition, computer vision and signal processing. In the real world, data often reside on low-dimensional manifolds embedded in a high-dimensional ambient space. However…
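The abstract is truncated, but the canonical LRR program it describes can be sketched as follows, with data matrix $X$, representation $Z$, sparse error $E$, and trade-off parameter $\lambda$:

$$
\min_{Z,E}\ \|Z\|_* + \lambda\|E\|_{2,1} \quad \text{s.t. } X = XZ + E.
$$

As an assumption here (the exact weighting in the paper may differ), the Laplacian-regularized variant adds a manifold-smoothness term with graph Laplacian $L$ and parameter $\beta$:

$$
\min_{Z,E}\ \|Z\|_* + \beta\,\operatorname{tr}\!\left(Z L Z^{\top}\right) + \lambda\|E\|_{2,1} \quad \text{s.t. } X = XZ + E.
$$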

Cited by 368 publications (152 citation statements)
References 41 publications
“…The issues involving the estimation of low-rank matrices have attracted extensive attention in recent years [14][4][16]. The low-rank regularizer in LRR has a profound connection with recent theoretical advances on robust principal component analysis (RPCA) [5][6], which leads to new and powerful modeling options for many applications.…”
Section: B. Low-Rank Representation
Citation type: mentioning (confidence: 99%)
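For reference, the standard RPCA program that the statement connects LRR to decomposes a data matrix $X$ into a low-rank part $A$ and a sparse error $E$:

$$
\min_{A,E}\ \|A\|_* + \lambda\|E\|_1 \quad \text{s.t. } X = A + E.
$$

LRR generalizes this by representing the data against a dictionary (the data itself), replacing the constraint with $X = XZ + E$ and penalizing the nuclear norm of the coefficient matrix $Z$ instead.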
“…Recently, inspired by the advances of SSC and LRR, numerous graph-based subspace clustering algorithms have been developed [13][19][4][14]. For instance, to preserve the manifold structure of the data, Cai et al.…”
Section: Subspace Clustering via Sparse Prior
Citation type: mentioning (confidence: 99%)
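As a concrete illustration of the pipeline this statement alludes to (a minimal sketch, not the cited authors' method), a self-representation matrix Z learned by SSC or LRR is symmetrized into an affinity matrix and fed to spectral clustering; Z below is a random placeholder:

```python
# Sketch: turn a self-representation matrix Z into cluster labels.
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_from_representation(Z, n_clusters):
    """Cluster samples from a self-representation matrix Z."""
    W = 0.5 * (np.abs(Z) + np.abs(Z.T))  # symmetric, non-negative affinity
    return SpectralClustering(
        n_clusters=n_clusters, affinity="precomputed", random_state=0
    ).fit_predict(W)

rng = np.random.default_rng(0)
Z = rng.random((20, 20))  # placeholder for a learned representation
print(cluster_from_representation(Z, n_clusters=2))
```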
“…Both the pairwise graph and the hyper-graph regularizer are introduced into the tracking model to learn the global and local structure information. $L = D_W - Z_W$ is the graph Laplacian matrix [45][46][47], where $D_W$ is the diagonal degree matrix whose entries are given by $D_W(i,i) = \sum_j Z_W(i,j)$. $Z_W$ is the weight matrix, which describes the similarity measure between every pair of subtask representations.…”
Section: The Proposed Tracking Model
Citation type: mentioning (confidence: 99%)
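A minimal numpy sketch of that construction, assuming $Z_W$ is a symmetric, non-negative similarity matrix:

```python
# Unnormalized graph Laplacian L = D_W - Z_W, where D_W is the diagonal
# degree matrix with D_W(i, i) = sum_j Z_W(i, j).
import numpy as np

def graph_laplacian(Z_W):
    """Unnormalized graph Laplacian L = D_W - Z_W."""
    D_W = np.diag(Z_W.sum(axis=1))  # degree matrix from row sums
    return D_W - Z_W

# Toy weight matrix over three subtask representations.
Z_W = np.array([[0.0, 0.8, 0.2],
                [0.8, 0.0, 0.5],
                [0.2, 0.5, 0.0]])
L = graph_laplacian(Z_W)
print(L.sum(axis=1))  # each row of L sums to zero: [0. 0. 0.]
```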