Improved support vector classification using PCA and ICA feature space modification
Pattern Recognition, 2004
DOI: 10.1016/j.patcog.2003.11.009

Cited by 54 publications (24 citation statements)
References 16 publications
“…However, tests showed that the other three datasets did not follow a normal distribution, and the PCA and kPCA methods were not able to successfully classify them. The PCA method also relies on calculating the largest variance in order to determine which components are used, but these are not necessarily the directions of maximum discrimination as there is no attempt to use class information, such as within-class scatter [8,48]. The poor performance of these methods on the Brodatz, Fractal and Abraded datasets means they are not suitable for the classification of tribological surfaces.…”
Section: Classifiers Using PCA and kPCA Methods
Citation type: mentioning (confidence: 99%)
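The following is a minimal Python sketch (my own illustration, not code from the citing paper) of the point made above: PCA chooses directions of maximum variance, while a class-aware criterion such as LDA, which uses within-class scatter, can pick out a low-variance but highly discriminative direction. The synthetic dataset and all parameter values are hypothetical.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n = 500
y = rng.integers(0, 2, size=n)
# Feature 0: large variance but identical for both classes (uninformative).
# Feature 1: small variance but class-dependent mean (discriminative).
X = np.column_stack([
    rng.normal(0.0, 10.0, size=n),
    rng.normal(0.0, 0.5, size=n) + 2.0 * y,
])

pca = PCA(n_components=1).fit(X)
lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)

print("PCA direction:", pca.components_[0])   # roughly [1, 0]: follows variance
print("LDA direction:", lda.scalings_[:, 0])  # roughly [0, 1]: follows the classes

Here PCA's leading component aligns with the uninformative high-variance feature, while LDA's direction aligns with the low-variance feature that actually separates the classes.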
“…Hence, methods that recover independent components are more powerful than those that recover merely uncorrelated components. For instance, principal component analysis (PCA) [4], based on eigenvalue decomposition, is a well-known method that yields uncorrelated components. But those components are usually not close enough to the source signals, especially for correlated signals.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
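A minimal sketch (again my own illustration, not from the citing paper) contrasting the two: PCA, computed here directly from an eigenvalue decomposition of the covariance matrix, yields uncorrelated components that remain mixtures of the sources, while FastICA seeks statistically independent components and approximately unmixes non-Gaussian sources. The mixing matrix and signals are hypothetical.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.column_stack([np.sign(np.sin(3 * t)),  # square-wave source
                     np.sin(2 * t)])          # sinusoidal source
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # hypothetical mixing matrix
X = S @ A.T                                   # observed mixtures

# PCA "by hand": eigenvalue decomposition of the covariance matrix.
Xc = X - X.mean(axis=0)
_, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
pca_scores = Xc @ eigvecs          # uncorrelated, but the sources stay mixed

ica = FastICA(n_components=2, random_state=0)
ica_scores = ica.fit_transform(X)  # close to the sources, up to scale and order

# Both sets of components are (nearly) uncorrelated; only ICA recovers
# the independent source signals.
print("PCA component correlation:", np.corrcoef(pca_scores.T)[0, 1])
print("ICA component correlation:", np.corrcoef(ica_scores.T)[0, 1])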
“…The processing of vectors in such a space is usually a difficult task due to the “curse of dimensionality” problem. In the last few years, a variety of dimensionality reduction transforms have been applied in the context of face recognition: the Karhunen-Loève (KL) transform (Turk and Pentland, 1991), also known as Principal Component Analysis; Independent Component Analysis (Bartlett et al., 2002; Fortuna and Capson, 2004); and Linear Discriminant Analysis (LDA), also called Fisher Discriminant Analysis (Zhao et al., 1998). Some extensions of Principal Component Analysis have been studied and applied to face recognition: Nonlinear Principal Component Analysis (Kramer, 1991) and Kernel Principal Component Analysis (Yang et al., 2000).…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
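To connect this back to the indexed paper's setting, here is a minimal, assumption-laden sketch of such a pipeline: reduce high-dimensional face images with PCA or ICA, then classify with a support vector machine. The dataset (scikit-learn's LFW loader, downloaded on first use), the component counts, and the kernel choice are illustrative assumptions, not the authors' configuration.

from sklearn.datasets import fetch_lfw_people
from sklearn.decomposition import PCA, FastICA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

faces = fetch_lfw_people(min_faces_per_person=50)  # downloads on first use
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, stratify=faces.target, random_state=0)

# Compare a PCA-based and an ICA-based feature space ahead of the SVM,
# echoing the PCA/ICA feature space modification studied in the paper.
for reducer in (PCA(n_components=100, whiten=True),
                FastICA(n_components=100, random_state=0, max_iter=500)):
    clf = make_pipeline(StandardScaler(), reducer, SVC(kernel="rbf"))
    clf.fit(X_tr, y_tr)
    print(type(reducer).__name__, "test accuracy:", clf.score(X_te, y_te))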