2016
DOI: 10.1109/tcyb.2015.2457611

Low-Rank Preserving Projections

Abstract: As one of the most popular dimensionality reduction techniques, locality preserving projections (LPP) has been widely used in computer vision and pattern recognition. In practical applications, however, data are often corrupted by noise. For corrupted data, samples from the same class may not lie in each other's nearest neighborhoods, so LPP may lose its effectiveness. In this paper, it is assumed that the data are grossly corrupted and that the noise matrix is sparse. Based on these assumptions, we propose a novel dimensionality reduction method…
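
For context, the sketch below shows the standard LPP eigenproblem that the abstract refers to (heat-kernel weights on a k-nearest-neighbor graph, then a generalized eigendecomposition). It is not the paper's LRPP solver, and all function and parameter names are illustrative.

```python
# Minimal sketch of standard LPP (not the paper's LRPP).
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def lpp(X, n_components=2, n_neighbors=5, t=1.0):
    """X: (n_samples, n_features). Returns an (n_features, n_components) projection."""
    n = X.shape[0]
    D2 = cdist(X, X, "sqeuclidean")
    # heat-kernel weights on a symmetric k-NN graph
    idx = np.argsort(D2, axis=1)[:, 1:n_neighbors + 1]   # skip self at column 0
    W = np.zeros((n, n))
    for i in range(n):
        W[i, idx[i]] = np.exp(-D2[i, idx[i]] / t)
    W = np.maximum(W, W.T)
    D = np.diag(W.sum(axis=1))
    L = D - W                                            # graph Laplacian
    # generalized eigenproblem X^T L X a = lambda X^T D X a; keep smallest eigenpairs
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])          # small ridge for stability
    _, vecs = eigh(A, B)
    return vecs[:, :n_components]
```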

Cited by 144 publications (46 citation statements) · References 41 publications
“…In this section, we evaluate our algorithm on several well-known databases (AR, Extended YaleB, LFWcrop, Fifteen Scene Categories and Caltech101), whose details are presented in Table 1. We compare our method with PCA [28], LPP [11], NPE [14] and LRPP [21]. For LPP, NPE, LRPP and our method, we first use PCA to reduce the dimensionality of all original datasets (except for the Fifteen Scene Categories database) to 200, and then extract features with these four methods, respectively.…”
Section: Methods (mentioning, confidence: 99%)
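
The two-stage protocol quoted above (PCA to 200 dimensions, then feature extraction with one of the four methods) could be sketched as follows; `lpp()` stands in for any of the second-stage methods and comes from the earlier sketch, and the train/test split is assumed.

```python
# Hedged sketch of the quoted evaluation pipeline: PCA first, then a
# locality-based projection on the reduced data. Names are illustrative.
import numpy as np
from sklearn.decomposition import PCA

def two_stage_features(X_train, X_test, n_pca=200, n_out=50):
    pca = PCA(n_components=n_pca).fit(X_train)
    Z_train, Z_test = pca.transform(X_train), pca.transform(X_test)
    A = lpp(Z_train, n_components=n_out)   # lpp() from the earlier sketch
    return Z_train @ A, Z_test @ A
```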
“…For example, Xu et al. [20] tried to preserve the global and local structures of the data by imposing joint low-rank and sparse constraints on the reconstruction coefficient matrix. Lu et al. [21] proposed a method named low-rank preserving projections (LRPP), which learns a low-rank weight matrix by projecting the data onto a low-dimensional subspace. In addition, LRPP advocates the use of the L21 norm as a sparse constraint on the noise matrix and the nuclear norm as a low-rank constraint on the weight matrix, which preserves the global structure of the data during the dimensionality reduction procedure.…”
Section: Introduction (mentioning, confidence: 99%)
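
Each of the two penalties mentioned above has a standard closed-form proximal operator, sketched below as generic ADMM-style building blocks (singular value thresholding for the nuclear norm, row-wise shrinkage for the L21 norm). This is not the exact LRPP algorithm, only the standard operators its penalties imply.

```python
# Generic proximal operators for the two penalties described above.
import numpy as np

def svt(M, tau):
    """Prox of tau * nuclear norm: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def prox_l21(M, tau):
    """Prox of tau * L21 norm: shrink whole rows toward zero
    (use axis=0 instead if the noise matrix is penalized column-wise)."""
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return M * scale
```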
“…To achieve an efficient solution, the iterative singular value thresholding scheme [60,61], originally proposed for nuclear norm minimization, can be extended to solve (13), which has a non-convex low-rank constraint. Let $\tilde{P}^k = L^k + Z_1^k/\alpha_1$; the solution $P^{k+1}$ is then given by
$$P^{k+1} = \sum_{i=1}^{\min\{N_x N_y,\, N_t\}} \max\!\left(\sigma_i - \frac{\mu_1\, \sigma_i^{\,p-1}}{\alpha_1},\ 0\right) u_i v_i^{T},$$
where the superscript $T$ denotes the transpose (conjugate transpose) operator for real (complex) matrices or vectors.…”
Section: K-t NCRPCA: Formulation (mentioning, confidence: 99%)
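
Read literally, the quoted update is a weighted singular value thresholding of $\tilde{P}^k$. The sketch below assumes, consistent with standard SVT, that $\sigma_i$, $u_i$, $v_i$ come from the SVD of $\tilde{P}^k$, and reuses the excerpt's $\mu_1$, $\alpha_1$ and Schatten exponent $p$ (with $0 < p < 1$).

```python
# Sketch of the weighted SVT step in the quoted equation (assumptions above).
import numpy as np

def nonconvex_svt(P_tilde, mu1, alpha1, p):
    U, s, Vt = np.linalg.svd(P_tilde, full_matrices=False)
    s_safe = np.maximum(s, 1e-12)                 # avoid 0**(p-1) blow-up for p < 1
    s_new = np.maximum(s - (mu1 / alpha1) * s_safe ** (p - 1), 0.0)
    return U @ np.diag(s_new) @ Vt
```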
“…Recently, the representation-based algorithms have also been introduced to the GE framework to construct various similarity graphs [31]. For example, Sparse Representation (SR), Collaborative Representation (CR) and Low Rank Representation (LRR) [32] are utilized to constitute the sparse graph ($\ell_1$ graph), the collaborative graph ($\ell_2$ graph) and the low-rank graph, leading to Sparsity Preserving Projection (SPP) [33], Collaborative Representation based Projection (CRP) [34] and Low Rank Preserving Projections (LRPP) [35], respectively.…”
Section: Introduction (mentioning, confidence: 99%)
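
As one concrete instance of such representation-based graphs, the sketch below builds the $\ell_2$ (collaborative) graph by coding each sample over all the others with ridge regression; swapping the $\ell_2$ penalty for an $\ell_1$ or nuclear-norm penalty would yield the sparse and low-rank graphs instead. The function is illustrative, not taken from any of the cited papers.

```python
# Illustrative l2 (collaborative-representation) graph construction.
import numpy as np

def l2_graph(X, lam=0.1):
    """X: (n_samples, n_features). Returns a symmetric affinity matrix."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        B = np.delete(X, i, axis=0).T                  # dictionary of other samples
        c = np.linalg.solve(B.T @ B + lam * np.eye(n - 1), B.T @ X[i])
        W[i] = np.abs(np.insert(c, i, 0.0))            # coefficient magnitudes
    return (W + W.T) / 2
```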