2016
DOI: 10.1049/iet-cvi.2014.0434
Face recognition using supervised probabilistic principal component analysis mixture model in dimensionality reduction without loss framework

Abstract: In this study, first a supervised version of the probabilistic principal component analysis (PPCA) mixture model is proposed. Using this model, local linear underlying manifolds of the data samples are obtained. These underlying manifolds are then used in a dimensionality reduction without loss framework for the face recognition application. In this framework, the benefits of dimensionality reduction are exploited in the predictive model, while, using the projection penalty idea, the loss of useful information is minimised. The auth…
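A rough, assumption-laden sketch of the per-class modelling idea described in the abstract (not the authors' implementation): one probabilistic PCA is fitted per class label as a simple stand-in for the supervised PPCA mixture, giving a local linear manifold per class, and a new face is assigned to the class whose model yields the highest likelihood. The projection-penalty / lossless-framework step of the paper is not reproduced here, and all names and parameters below are illustrative assumptions.

# Hypothetical sketch; not the paper's method, only the per-class PPCA idea.
import numpy as np
from sklearn.decomposition import PCA

class PerClassPPCA:
    """One probabilistic PCA (local linear manifold) per class label."""

    def __init__(self, n_components=20):
        self.n_components = n_components
        self.models = {}

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            # Keep n_components well below the per-class sample count so the
            # estimated PPCA noise variance stays positive.
            k = max(1, min(self.n_components, Xc.shape[0] - 2, Xc.shape[1] - 1))
            self.models[label] = PCA(n_components=k).fit(Xc)
        return self

    def predict(self, X):
        labels = list(self.models)
        # PCA.score_samples gives the per-sample log-likelihood under the
        # probabilistic PCA model fitted for each class.
        loglik = np.column_stack([self.models[l].score_samples(X) for l in labels])
        return np.asarray(labels)[loglik.argmax(axis=1)]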

Cited by 24 publications (12 citation statements)
References 26 publications
“…In this study, we have applied the principal component analysis (PCA) method which provides linear mapping based on an eigenvector search. PCA provides different approaches to reduce the feature space dimensionality [35,36]. In this study, the dataset is split into 70 : 30 ratio, i.e., 70% of the data is used for training, while 30% is used for the testing purpose.…”
Section: Methods (mentioning)
confidence: 99%
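A minimal sketch of the workflow described in this excerpt, under assumptions not stated there: scikit-learn's PCA as the eigenvector-based linear mapping, the Olivetti faces dataset, and a 1-nearest-neighbour classifier, with the quoted 70:30 train/test split.

# Illustrative sketch; dataset and classifier choices are assumptions.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

faces = fetch_olivetti_faces()
X, y = faces.data, faces.target

# 70% of the data for training, 30% for testing, as in the quoted study.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=0)

# PCA: a linear mapping obtained from an eigenvector search on the training
# covariance; keep enough components to explain 95% of the variance.
pca = PCA(n_components=0.95, svd_solver="full").fit(X_train)
Z_train, Z_test = pca.transform(X_train), pca.transform(X_test)

clf = KNeighborsClassifier(n_neighbors=1).fit(Z_train, y_train)
print("test accuracy:", clf.score(Z_test, y_test))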
“…Preserving as much variability as possible refers to the discovery of new variables that are linear functions of those in the original dataset. These linear functions maximize the variance and are also uncorrelated with each other [26]. Literature on PCA dates back from [8] and also [9] who coined the term principal components.…”
Section: Principal Component Analysis (PCA) (mentioning)
confidence: 99%
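The variance-maximising, uncorrelated components mentioned in this excerpt can be checked numerically. The sketch below (illustrative, not from the cited works) computes principal components as eigenvectors of the sample covariance and verifies that the resulting scores are uncorrelated, with variances equal to the eigenvalues.

# Toy check of the PCA properties quoted above; data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5)) @ rng.normal(size=(5, 5))  # correlated toy data

Xc = X - X.mean(axis=0)                       # centre the variables
cov = np.cov(Xc, rowvar=False)                # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)        # eigen-decomposition
order = np.argsort(eigvals)[::-1]             # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

Z = Xc @ eigvecs                              # scores of the new variables
# Off-diagonal covariances of the scores are ~0 (components uncorrelated),
# and their variances match the eigenvalues, largest first.
print(np.round(np.cov(Z, rowvar=False), 6))
print(np.round(eigvals, 6))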
“…Zhao et al [13] suggested a technique for dimensional reduction to the use of a spectral-space-based classification (SSFC) device to reduce the spectral dimensions. Typically, in a random way the most complicated information was taken using the Convolutional Neural Network (CNN) technique.…”
Section: Related Work (mentioning)
confidence: 99%