2015
DOI: 10.5120/20530-2871

Enhanced Face Recognition based on PCA and SVM

Abstract: Feature extraction and classification are important aspects of pattern recognition and computer vision. Principal Component Analysis (PCA) is a well-known feature extraction and data representation technique, but it is affected by illumination conditions. This paper presents a combination of PCA and SVM for face recognition. Before applying Principal Component Analysis, the images are preprocessed using the wavelet transform. PCA is then applied for feature extraction, and a Support Vector Machine is used for classification.
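The pipeline the abstract describes (preprocessing, PCA for feature extraction, SVM for classification) can be sketched in NumPy. This is an illustrative reconstruction, not the authors' code: the synthetic two-class data, the component count `k`, and the plain sub-gradient hinge-loss trainer for a linear SVM are all assumptions, and the wavelet preprocessing step is omitted for self-containment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for face images: two classes of flattened 8x8 "images"
# with class-specific offsets (the paper's wavelet preprocessing is omitted).
n_per_class, d = 20, 64
mean0 = np.zeros(d); mean0[:8] = 2.0
mean1 = np.zeros(d); mean1[:8] = -2.0
X = np.vstack([rng.normal(mean0, 1.0, (n_per_class, d)),
               rng.normal(mean1, 1.0, (n_per_class, d))])
y = np.concatenate([np.ones(n_per_class), -np.ones(n_per_class)])

# PCA via SVD: center the data, project onto the top-k principal components.
k = 5
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
Z = (X - mu) @ Vt[:k].T            # n x k feature matrix

# Linear SVM trained by batch sub-gradient descent on the hinge loss.
lam, lr, epochs = 0.01, 0.1, 200
w, b = np.zeros(k), 0.0
for _ in range(epochs):
    margins = y * (Z @ w + b)
    viol = margins < 1             # samples violating the margin
    grad_w = lam * w               # L2 regularization term
    grad_b = 0.0
    if viol.any():
        grad_w -= (y[viol, None] * Z[viol]).mean(axis=0)
        grad_b -= y[viol].mean()
    w -= lr * grad_w
    b -= lr * grad_b

pred = np.sign(Z @ w + b)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

With well-separated synthetic classes the projected features remain separable after PCA, so the linear SVM fits the training set easily; on real face data, kernel SVMs and cross-validated `k` would be the usual choices.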

Cited by 5 publications (1 citation statement). References 10 publications.
“…Generally, there are two types of pattern classification methods [4,5]: parametric methods and nonparametric methods. Parametric methods such as support vector machine (SVM) [6,7] center on how to learn the parameters of a hypothesis classification model from the training samples and then use them to identify the class labels of test samples. In contrast, the nonparametric methods, such as nearest neighbor (NN) [8] and nearest subspace (NS) [9], use the training samples directly to identify the class labels of test samples.…”
Section: Introduction (mentioning)
confidence: 99%
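The citing authors' parametric/nonparametric distinction can be made concrete: a nearest-neighbor classifier learns no parameters at all; the "model" is the stored training set, and a test sample takes the label of its closest stored sample. A minimal sketch (the toy 2-D data here is assumed, not drawn from either paper):

```python
import numpy as np

# Nonparametric 1-NN: no training phase, just distance to stored samples.
def nn_classify(X_train, y_train, x):
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distances
    return y_train[np.argmin(dists)]             # label of nearest sample

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y_train = np.array([0, 0, 1, 1])

print(nn_classify(X_train, y_train, np.array([0.3, 0.1])))  # prints 0
print(nn_classify(X_train, y_train, np.array([4.8, 5.1])))  # prints 1
```

A parametric method such as the SVM above would instead compress the training set into a fixed weight vector and discard the samples at test time.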