2008
DOI: 10.1016/s1004-4132(08)60065-1
Support vector classifier based on principal component analysis

Cited by 6 publications (3 citation statements)
References 9 publications
“…To solve the irregular packing problem, Gua et al. proposed a packing algorithm based on the PCA methodology, which resulted in an increased filling rate, decreased packing time, and increased packing number compared to the MGA method [63]. Zheng et al. proposed a PCA-based support vector classifier and noted an increased identification rate on their heart and adult data sets compared to the conventional support vector classifier [64]. Table 4 shows the previous research conducted using PCA and how it affected the recognition rate results.…”
Section: Principal Component Analysis
confidence: 99%
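As background for the PCA-based classifier described above, the dimensionality-reduction step can be sketched in the usual way (a generic textbook formulation, not necessarily the exact one used by Zheng et al. [64]): given centered samples x_i \in \mathbb{R}^d, estimate the covariance

\Sigma = \frac{1}{n}\sum_{i=1}^{n} x_i x_i^\top,

compute its leading eigenvectors \Sigma v_k = \lambda_k v_k with \lambda_1 \ge \dots \ge \lambda_d, and project each sample onto the top m components,

z_i = V_m^\top x_i, \qquad V_m = [v_1, \dots, v_m],

so that the support vector classifier is trained on the low-dimensional z_i rather than on the raw x_i.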
“…It is based on the structural risk minimization criterion, and its topology is determined by the support vectors. Therefore, it can overcome the shortcomings of artificial neural networks based on the empirical risk minimization criterion [7], and can handle problems involving small samples, nonlinearity, high dimensionality, etc. The Least Squares Support Vector Machine (LS-SVM) [8][9][10] is an extension of the standard SVM that replaces the inequality constraints of the SVM with equality constraints.…”
Section: Introduction
confidence: 99%
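The contrast drawn above between the standard SVM and the LS-SVM can be made concrete with the usual primal formulations (a textbook sketch, not taken from the cited references [8]-[10]). The soft-margin SVM solves

\min_{w,b,\xi}\; \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad y_i\bigl(w^\top\varphi(x_i)+b\bigr) \ge 1-\xi_i,\;\; \xi_i \ge 0,

whereas the LS-SVM replaces the hinge slack with a squared error and the inequality constraints with equalities,

\min_{w,b,e}\; \tfrac{1}{2}\|w\|^2 + \tfrac{\gamma}{2}\sum_{i=1}^{n} e_i^2
\quad\text{s.t.}\quad y_i\bigl(w^\top\varphi(x_i)+b\bigr) = 1 - e_i.

Because the constraints are equalities and the loss is quadratic, the KKT conditions reduce to a linear system instead of a quadratic program.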
“…In this paper, we first normalized the data, then used principal component analysis [8] for dimensionality reduction, and finally classified the multi-class data.…”
Section: The Experimental Data
confidence: 99%
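A minimal sketch of the normalize-then-PCA-then-classify pipeline described above, written with scikit-learn; the dataset (iris), the number of retained components, and the SVM hyperparameters are illustrative assumptions, not the data or settings used in the cited paper.

from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in multi-class data (assumption: iris, not the paper's data set).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Normalize, reduce dimension with PCA, then classify with an SVM
# (multi-class handling uses SVC's built-in one-vs-one scheme).
clf = make_pipeline(
    StandardScaler(),                 # zero-mean, unit-variance features
    PCA(n_components=2),              # keep the two leading components
    SVC(kernel="rbf", C=1.0, gamma="scale"),
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))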