2009
DOI: 10.1109/tpami.2008.258
Asymmetric Principal Component and Discriminant Analyses for Pattern Classification

Abstract: This paper studies the roles of the principal component and discriminant analyses in pattern classification and explores their problems with asymmetric classes and/or unbalanced training data. An asymmetric principal component analysis (APCA) is proposed to remove the unreliable dimensions more effectively than the conventional PCA. Targeted at the two-class problem, an asymmetric discriminant analysis in the APCA subspace is proposed to regularize the eigenvalue that is, in general, a biased estim…
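The core idea in the abstract, weighting the two class-conditional covariance matrices unequally before the eigendecomposition so that the better-sampled class dominates the estimate of reliable directions, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm; the weights `w1`, `w2`, the `energy` threshold, and the sample sizes are assumed placeholders.

```python
import numpy as np

def apca_subspace(X1, X2, w1=0.9, w2=0.1, energy=0.98):
    """Sketch of an asymmetric PCA: combine the class-conditional
    covariances with unequal weights (plain PCA would use the total
    covariance), then keep only the leading eigendirections."""
    C1 = np.cov(X1, rowvar=False)          # covariance of class 1
    C2 = np.cov(X2, rowvar=False)          # covariance of class 2
    C = w1 * C1 + w2 * C2                  # asymmetric combination
    vals, vecs = np.linalg.eigh(C)         # eigenvalues in ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1] # sort descending
    # keep enough directions to cover the requested energy fraction,
    # discarding the small-eigenvalue (unreliable) dimensions
    k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), energy)) + 1
    return vecs[:, :k]

rng = np.random.default_rng(0)
X1 = rng.normal(size=(200, 5))             # well-sampled class
X2 = rng.normal(size=(20, 5))              # unbalanced: far fewer samples
W = apca_subspace(X1, X2)
print(W.shape)
```

The returned matrix projects data onto the retained subspace; down-weighting the poorly sampled class keeps its noisy covariance estimate from steering which dimensions are discarded.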

Cited by 112 publications (12 citation statements)
References 20 publications
“…However, with a PCA-based method, the total covariance matrix does not effectively remove the unreliable dimensions if one class is represented by its training data much better or much worse than the other class. The asymmetric principal component analysis (APCA) [8] and linear subspace learning-based dimensionality reduction [9] alleviate this problem by asymmetrically weighting the class-conditional covariance matrices and by considering the important weak composition. Thus, we design a novel APCA-based method to identify the differentially expressed genes between patients with pmAF and the control group by combining the ideas in the two works above [8], [9].…”
Section: Methods
confidence: 99%
“…This bias is most pronounced when the population eigenvalues tend towards equality and is correspondingly less severe when their values are highly disparate. In all cases, this phenomenon becomes more pronounced as the sample size decreases [44–46]. Second, if there are very small eigenvalues in the sample covariance matrix, the logarithm of these tiny eigenvalues will incur a large disturbance that can dramatically degrade the generalization ability.…”
Section: Variants of the Region Covariance Descriptor
confidence: 99%
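The eigenvalue bias described in this citation is easy to reproduce numerically. The sketch below is a self-contained illustration, not code from the cited works: for data whose population covariance is the identity (all population eigenvalues equal 1, the worst case noted above), the largest sample eigenvalue systematically overshoots 1 and the smallest undershoots it, and both effects worsen as the sample size shrinks.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 20                                   # dimensionality; population covariance
                                         # is the identity, so every population
                                         # eigenvalue equals 1
results = {}
for n in (50, 500):                      # small vs. larger sample size
    top, bottom = [], []
    for _ in range(200):                 # average over repeated draws
        X = rng.normal(size=(n, d))
        vals = np.linalg.eigvalsh(np.cov(X, rowvar=False))
        top.append(vals.max())
        bottom.append(vals.min())
    results[n] = (np.mean(top), np.mean(bottom))
    print(f"n={n}: mean largest eigenvalue {results[n][0]:.2f}, "
          f"mean smallest {results[n][1]:.2f}")
```

With n=50 the spread between the largest and smallest sample eigenvalues is far wider than with n=500, even though every population eigenvalue is 1, which is exactly the small-sample bias the quoted passage warns about.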
“…However, this theoretical proof seems very often to be contradicted in practice, as performance does decrease with increasing dimensionality (known as the curse of dimensionality [8]). In [9] it is argued that the proof is based on knowing the data structure of the data-generating process, while in practice only a training set is available to infer this knowledge from. Several factors are known to corrupt this inference: incorrect sampling [9], errors in the measurements [10], [11], modeling errors [12], and errors in the currently used estimators [13], [14].…”
Section: Introduction
confidence: 99%
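The curse-of-dimensionality effect this citation describes, performance degrading as dimensions are added while the training set stays fixed, can be demonstrated with a toy experiment. This is an illustrative sketch under assumed settings (a nearest-class-mean classifier, class separation of 2 in a single informative feature, 10 training samples), not an experiment from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(2)

def nearest_mean_error(d, n_train=10, n_test=2000):
    """Test error of a nearest-class-mean classifier when only feature 0
    is informative and the remaining d-1 features are pure noise."""
    def sample(n):
        y = np.repeat([0, 1], n // 2)    # balanced classes
        X = rng.normal(size=(n, d))
        X[:, 0] += 2.0 * y               # separation lives in feature 0 only
        return X, y
    Xtr, ytr = sample(n_train)
    Xte, yte = sample(n_test)
    m0 = Xtr[ytr == 0].mean(axis=0)      # class means estimated from the
    m1 = Xtr[ytr == 1].mean(axis=0)      # small training set
    pred = (np.linalg.norm(Xte - m1, axis=1)
            < np.linalg.norm(Xte - m0, axis=1)).astype(int)
    return float(np.mean(pred != yte))

# average error over repeated trials, with and without noise dimensions
errs = {d: float(np.mean([nearest_mean_error(d) for _ in range(100)]))
        for d in (1, 100)}
print(errs)
```

Although the added 99 features carry no class information, estimating their means from only 10 samples injects noise into every distance computation, so the error at d=100 is clearly worse than at d=1, matching the quoted observation that finite training data, not the theory, is what breaks down.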