2010
DOI: 10.1142/s0129065710002346
Data Compression and Regression Through Local Principal Curves and Surfaces

Abstract: (2010) 'Data compression and regression through local principal curves and surfaces.', International Journal of Neural Systems, 20 (3), pp. 177-192. Further information on publisher's website: http://dx.doi.org/10.1142/S0129065710002346. Publisher's copyright statement: Additional information: Use policy: The full-text may be used and/or reproduced, and given to third parties in any format or medium, without prior permission or charge, for personal research or study, educational, or not-for-profit purposes provid…

Cited by 14 publications (11 citation statements). References 16 publications.
“…For instance, the local principal surface algorithm [7] does make use of the mean shift as the essential tool of estimation too, so that the methods proposed here should straightforwardly extend to this case, and preliminary investigations brought encouraging results.…”
Section: Discussion
confidence: 99%
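The mean shift referred to in this excerpt is a kernel-weighted local mean update that moves a point towards nearby high-density regions. The following is a minimal illustrative sketch of one Gaussian-kernel mean shift step (the data, the bandwidth h, and the function name are assumptions for illustration, not taken from the cited papers); roughly speaking, the local principal curve/surface algorithm combines such local-mean steps with moves along a local first principal component.

```python
import numpy as np

def mean_shift_step(X, x0, h=0.5):
    """One Gaussian-kernel mean shift step: move x0 to the
    kernel-weighted mean of the data X (n x p) around x0."""
    d2 = np.sum((X - x0) ** 2, axis=1)             # squared distances to x0
    w = np.exp(-d2 / (2.0 * h ** 2))               # Gaussian kernel weights
    return (w[:, None] * X).sum(axis=0) / w.sum()  # weighted local mean

# Toy usage: iterate until the update stabilises near a density mode.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
x = np.array([1.0, 1.0])
for _ in range(50):
    x_new = mean_shift_step(X, x, h=0.5)
    if np.linalg.norm(x_new - x) < 1e-8:
        break
    x = x_new
```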
“…Each data point x_i can be projected onto the curve and represented ("compressed") through a univariate projection index λ_i. Details on these techniques are irrelevant for the presentation of this paper [7]. If H = h^2 I_p, one can show that, asymptotically [9],…”
Section: Local Principal Curves As Estimates Of Density Ridges
confidence: 99%
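To make the "compression" of each x_i into a single projection index λ_i concrete, here is a small sketch assuming the fitted curve is available as an ordered polyline of points (one common representation, not necessarily the one used in the paper): each data point is projected onto its nearest curve segment and represented by the arc-length coordinate of that projection.

```python
import numpy as np

def projection_index(curve, X):
    """Project each row of X (n x p) onto the polyline 'curve' (m x p,
    ordered points) and return the arc-length coordinate lambda_i of the
    projection -- the one-dimensional "compressed" representation of x_i."""
    seg = np.diff(curve, axis=0)                       # segment vectors
    seg_len = np.linalg.norm(seg, axis=1)
    arc = np.concatenate(([0.0], np.cumsum(seg_len)))  # arc length at vertices
    lam = np.empty(len(X))
    for i, x in enumerate(X):
        # Orthogonal projection onto every segment, clipped to the segment.
        t = np.clip(np.einsum('ij,ij->i', x - curve[:-1], seg) / seg_len ** 2, 0.0, 1.0)
        proj = curve[:-1] + t[:, None] * seg
        j = np.argmin(np.linalg.norm(x - proj, axis=1))  # closest segment
        lam[i] = arc[j] + t[j] * seg_len[j]              # arc-length coordinate
    return lam

# Toy usage: points scattered around a quarter circle, curve given as a polyline.
theta = np.linspace(0, np.pi / 2, 50)
curve = np.column_stack([np.cos(theta), np.sin(theta)])
rng = np.random.default_rng(0)
X = curve[::5] + rng.normal(scale=0.05, size=(10, 2))
lam = projection_index(curve, X)   # one lambda_i per data point
```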
“…The extension of PCA to curvilinear spaces is called 'manifold learning', a type of unsupervised machine learning technique (Einbeck et al 2010); it includes two types of method, the nonlinear PCA (Scholz et al 2008) and principal manifold techniques (Gorban and Zinovyev 2008). Some have been used to represent objects in space ranging from molecules (Gorban and Zinovyev 2008) to anatomical structures (Failmezger et al 2013).…”
Section: Constructing Alternative Hypotheses In Real World Scenarios
confidence: 99%
“…It highlights that the principal curve minimizes the sum of orthogonal distances, while a spline model fit tries to minimize the sum of distances parallel to the y axis. The principal surface algorithm and its extensions (Dong and McAvoy 1996; Einbeck et al 2010; Gerber et al 2009; Goldsmith et al 2011a; Jung et al 2011; Leblanc and Tibshirani 1994; Ozertem and Erdogmus 2011) do not provide a consistent parametrization in the 2D space (see Section 3). ISOMAP (Tenenbaum et al 2000) and Maximum Variance Unfolding (MVU) (Weinberger and Saul 2006) are methods for dimension reduction.…”
Section: Introduction
confidence: 99%
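The contrast drawn in this excerpt can be illustrated in the linear special case, where the principal curve reduces to the first principal component: an ordinary regression line minimises vertical (y-parallel) residuals, whereas the principal component line minimises orthogonal ones. The sketch below is illustrative only; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 100)
y = 2.0 * x + rng.normal(scale=0.2, size=x.size)
X = np.column_stack([x, y])

# Regression line: minimises the sum of squared vertical (y-parallel) residuals.
a, b = np.polyfit(x, y, 1)
vertical_ss = np.sum((y - (a * x + b)) ** 2)

# First principal component: minimises the sum of squared orthogonal residuals,
# the criterion that principal curves generalise to the nonlinear case.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = np.outer(Xc @ Vt[0], Vt[0])        # projection onto the PC line
orthogonal_ss = np.sum((Xc - proj) ** 2)
```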