2015
DOI: 10.12988/ams.2015.55408
A new approach: the local feature extraction based on the new regulation of the locally preserving projection

Abstract: A novel local feature extraction method was proposed for face recognition. To optimize the eigenvalues and eigenvectors, a new regularization has been embedded into the Locally Preserving Projection. The proposed method reduces the computation time needed to obtain the new subspace compared with the original Locally Preserving Projection, and it produces an orthogonal basis function matrix; an orthogonal basis function matrix can be reconstructed more easily than a non-orthogonal one. The proposed method has …
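The abstract's pipeline (a neighbourhood graph, a Laplacian, and an eigen-decomposition yielding a projection subspace) can be sketched as below. This is a minimal sketch of *standard* LPP, not the paper's regularized variant; the parameter names (`n_neighbors`, `t`) and the small ridge term are assumptions for numerical stability, not part of the cited method.

```python
import numpy as np

def lpp_basis(X, n_neighbors=5, t=1.0, n_components=2):
    """Minimal Locality Preserving Projection sketch.

    X: (n_samples, n_features) data matrix.
    Returns a (n_features, n_components) projection basis.
    """
    n = X.shape[0]
    # Pairwise squared distances and a heat-kernel affinity over kNN
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]  # skip self
        W[i, idx] = np.exp(-d2[i, idx] / t)
    W = np.maximum(W, W.T)        # symmetrize the affinity graph
    D = np.diag(W.sum(1))
    L = D - W                      # graph Laplacian
    # Generalized eigenproblem X^T L X a = lambda X^T D X a
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-8 * np.eye(X.shape[1])  # ridge (assumption)
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(eigvals.real)               # smallest first for LPP
    return eigvecs[:, order[:n_components]].real
```

The paper's contribution, per the abstract, is a regularization that makes the resulting basis orthogonal; plain LPP as sketched here does not guarantee that.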

Cited by 5 publications (5 citation statements)
References 18 publications (16 reference statements)
“…The calculation results of the S-LST as written in Equation (5)… The results of Equation (10) delivered the eigenvalues and eigenvectors, where the eigenvalues are ordered decreasingly as mentioned in Equation (11), and the eigenvectors are shifted to the corresponding column positions to obtain the primary features as written in Equation (12)…”
Section: Compute the New Spaces of the S-LST
confidence: 99%
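The decreasing eigenvalue ordering and the corresponding column shift of the eigenvectors described in this excerpt can be sketched in a few lines; using `S` to stand for the S-LST result matrix is an assumption, and the sketch only assumes `S` is symmetric.

```python
import numpy as np

def sorted_eigenpairs(S, k):
    """Sort eigenvalues decreasingly and shift the eigenvector columns
    to match, keeping the k primary features (S assumed symmetric)."""
    eigvals, eigvecs = np.linalg.eigh(S)   # ascending for symmetric S
    order = np.argsort(eigvals)[::-1]      # decreasing eigenvalue order
    return eigvals[order][:k], eigvecs[:, order][:, :k]
```

Reordering the eigenvector columns together with the eigenvalues is what keeps each "primary feature" paired with its importance score.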
“…Many approaches have been proposed to overcome these problems, i.e. Covariance-based subspace [1], Eigenface [2][3][4], Fisherface [5][6][7][8][9], Independent Component Analysis [10], Local manifold model [11][12], Laplacian Smoothing Transform (LST) [13][14], Kernel subspace [15], Tangent Space [16], Sparse Neighborhood [17], Wavelet [18], Graph embedding [19], Matrix-based Features [20], Vertical and Horizontal Information [21], Locally Linear Regression [22], and Homogeneous and Non-homogeneous Polynomial Model [23].…”
Section: Introduction
confidence: 99%
“…The first process of Two-Dimensional Eigenfaces is to compute the average of the training sets as shown in Equation (…). The result of Equation (9) will be applied to compute the zero mean and the covariance, which can be written in Equations (10) and (11)…”
Section: Two-Dimensional Eigenface of the LDM
confidence: 99%
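The mean / zero-mean / covariance steps quoted above follow the usual two-dimensional Eigenface recipe. The sketch below is that generic 2D-PCA computation (image covariance accumulated per image without vectorizing), not necessarily the cited LDM formulation.

```python
import numpy as np

def image_covariance(images):
    """2D-Eigenface-style covariance over a stack of images.

    images: (n_images, h, w) array.
    Returns (mean_image, (w, w) covariance matrix).
    """
    mean = images.mean(axis=0)        # average of the training set
    zero_mean = images - mean         # zero-mean images
    w = mean.shape[1]
    cov = np.zeros((w, w))
    for A in zero_mean:
        cov += A.T @ A                # column-direction scatter per image
    return mean, cov / len(images)
```

Working on the (w, w) image covariance instead of a flattened (h·w, h·w) one is the point of the two-dimensional variant: the eigenproblem stays small.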
“…Principally, dimensionality reduction is used to obtain the main features of an object, so that the computation time of the similarity measurement can be significantly reduced. Many methods have been developed to overcome high dimensionality, such as the Principal Component Analysis, well known as PCA [7][8], Linear Discriminant Analysis [7], Kernel PCA [9][10], Linear Preserving Projection or Laplacianfaces [11][12], Gabor Wavelet [13], and Two-Dimensional Fisherface [14][15]. One of the most common and oldest methods to extract the features is PCA.…”
Section: Introduction
confidence: 99%
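As a concrete instance of the PCA feature extraction this excerpt names, here is a minimal sketch: centre the vectors and project them onto the leading eigenvectors of the sample covariance. The function name and signature are illustrative, not from any cited paper.

```python
import numpy as np

def pca_features(X, n_components):
    """Project zero-mean samples onto the top covariance eigenvectors.

    X: (n_samples, n_features); returns (n_samples, n_components).
    """
    mean = X.mean(axis=0)
    Xc = X - mean                                 # zero-mean data
    cov = (Xc.T @ Xc) / (len(X) - 1)              # sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending order
    top = np.argsort(eigvals)[::-1][:n_components]
    return Xc @ eigvecs[:, top]                   # reduced features
```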
“…Many algorithms have been implemented for dimensionality reduction, which are appearance-based (Shih et al, 2008; Wright et al, 2009; Martinez and Kak, 2001; Yang et al, 2004; Li and Yuan, 2005; Muntasa, 2014; 2015a; 2015b; Zhang and Zha, 2004; Sanguinetti, 2008), geometry-based (Muntasa et al, 2012; Rizvandi et al, 2007; Cootes et al, 2000) and hybrid (Cootes et al, 2000; Tang and Wang, 2003).…”
Section: Introduction
confidence: 99%