A nonlinear PCA based on manifold approximation
Published: 2000
DOI: 10.1007/s001800000025

Abstract: We address the problem of generalizing Principal Component Analysis (PCA) from the approximation point of view. Given a data set in a high-dimensional space, PCA proposes approximations by linear subspaces. These linear models can show their limits when the data distribution is not Gaussian. To overcome these limits, we present Auto-Associative Composite (AAC) models based on manifold approximation. AAC models enjoy interesting theoretical properties that generalize those of PCA. We take advantage of these proper…
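The full text is not included in this record; as a minimal illustration of the approximation view described in the abstract, the sketch below fits the best rank-1 linear approximation (ordinary PCA) to a curved, non-Gaussian data set and reports its residual. It does not implement the paper's AAC models, and the function name `pca_reconstruct` is invented for the example.

```python
import numpy as np

def pca_reconstruct(X, k):
    """Approximate the rows of X by their projection onto the top-k
    principal subspace (the linear model the abstract refers to)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # Right singular vectors = eigenvectors of the empirical covariance matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                       # (p, k) basis of the principal subspace
    return mu + Xc @ W @ W.T           # best rank-k linear approximation

# Curved, clearly non-Gaussian data: points near a half-circle in R^2.
rng = np.random.default_rng(0)
t = rng.uniform(0, np.pi, 500)
X = np.c_[np.cos(t), np.sin(t)] + 0.02 * rng.normal(size=(500, 2))

err = np.mean(np.sum((X - pca_reconstruct(X, 1)) ** 2, axis=1))
print(f"mean squared residual of the 1-D linear model: {err:.3f}")
# The residual stays large because no line can follow the curve --
# the motivation for approximating by a manifold instead.
```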

Cited by 14 publications (13 citation statements)
References: 20 publications
“…More recent research includes contributions to techniques for functional PCA (see for example Brumback and Rice (1998), Cardot (2000), Cardot et al. (2000), Girard (2000), James et al. (2000), Boente and Fraiman (2000) and He et al.…”
Section: Introduction (mentioning)
confidence: 99%
“…), it appears that the "optimal" axis maximizes the mean distance between the projected points. An attractive feature of the index (16) is that its maximization benefits from an explicit solution in terms of the eigenvectors of the empirical covariance matrix V_{k−1} defined in (13). Friedman et al [15,13], and more recently Hall [21], proposed an index to find clusters, or use deviation-from-normality measures to reveal more complex structures of the scatter-plot.…”
Section: Computation of Principal Directions (mentioning)
confidence: 99%
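The index (16) cited in the statement above is not reproduced in this excerpt. Under the standard identity that the mean squared distance between projected points equals twice the projected variance, its maximizer is the leading eigenvector of the empirical covariance matrix, which is what the explicit solution refers to. The sketch below checks that identity numerically; the helper name `principal_axis` is made up, and plain `S` stands in for the matrix V_{k−1} of the quoted paper.

```python
import numpy as np

def principal_axis(R):
    """Unit axis maximizing the mean squared distance between projected
    points; it is the leading eigenvector of the empirical covariance
    matrix (here S, standing in for V_{k-1} of the quoted paper)."""
    S = np.cov(R, rowvar=False, bias=True)
    _, vecs = np.linalg.eigh(S)          # eigenvalues in ascending order
    return vecs[:, -1]                   # eigenvector of the largest one

rng = np.random.default_rng(1)
R = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.3])
a = principal_axis(R)

# Identity check: the mean squared pairwise distance of the projections
# equals 2 * a^T S a, so maximizing one maximizes the other.
proj = R @ a
pairwise = np.mean((proj[:, None] - proj[None, :]) ** 2)
print(pairwise, 2 * a @ np.cov(R, rowvar=False, bias=True) @ a)
```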
“…The matrix M_{k−1} = (m^{k−1}_{i,j}) is a first-order contiguity matrix, whose value is 1 when R^{k−1}_j is the nearest neighbor of R^{k−1}_i, and 0 otherwise. The upper part of (17) is proportional to the usual projected variance, see (16). The lower part is the distance between the projections of points which are nearest neighbors in R^p.…”
Section: Computation of Principal Directions (mentioning)
confidence: 99%
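Equation (17) itself is not shown in this excerpt, so the ratio below is only a plausible reading of the statement: projected variance in the numerator, squared distances between projections of nearest-neighbour pairs (via the first-order contiguity matrix) in the denominator. The exact constants of (17) are not reproduced, and both function names are hypothetical.

```python
import numpy as np

def contiguity_matrix(R):
    """First-order contiguity matrix: m[i, j] = 1 iff R[j] is the nearest
    neighbour of R[i] in R^p, 0 otherwise."""
    D = np.linalg.norm(R[:, None, :] - R[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)          # a point is not its own neighbour
    M = np.zeros_like(D)
    M[np.arange(len(R)), D.argmin(axis=1)] = 1.0
    return M

def contiguity_index(a, R):
    """Ratio in the spirit of the quoted index: projected variance over the
    squared distances between projections of nearest-neighbour pairs."""
    a = a / np.linalg.norm(a)
    proj = R @ a
    M = contiguity_matrix(R)
    top = np.var(proj)                                          # projected spread
    bottom = np.sum(M * (proj[:, None] - proj[None, :]) ** 2)   # local spread
    return top / bottom

rng = np.random.default_rng(2)
R = rng.normal(size=(100, 4))
print(contiguity_index(np.ones(4), R))
```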
“…In more recent times the arguments have commonly been pressed into service in functional data analysis. For relatively recent contributions of this type, see, for example, Bosq (2000), Cardot (2000), Cardot et al (2000, 2003), Girard (2000), James et al (2000), Kneip and Utikal (2001), Boente and Fraiman (2000), He et al (2003), Yamanishi and Tanaka (2005), Hall et al (2006), Yao and Lee (2006), Cardot (2007), Cai and Hall (2006) and Hall and Hosseini‐Nasab (2006).…”
Section: Introduction (mentioning)
confidence: 99%