2019
DOI: 10.2140/paa.2019.1.515
Connecting dots: from local covariance to empirical intrinsic geometry and locally linear embedding

Abstract: Local covariance structure under the manifold setup has been widely applied in the machine learning community. Based on the established theoretical results, we provide an extensive study of two relevant manifold learning algorithms, empirical intrinsic geometry (EIG) and locally linear embedding (LLE), under the manifold setup. In particular, we show that without an accurate dimension estimation, the geodesic distance estimation by EIG might be corrupted. Furthermore, we show that by taking the local covarianc…

Cited by 14 publications (15 citation statements)
References 28 publications
“…As is discussed in [15], since $I_{p,r}(C_x + cI_{p\times p})^{-1} I_{p,r}$ can be viewed as the "regularized precision matrix", we can view $(\iota(z) - \iota(x))^{\top} I_{p,r}(C_x + cI_{p\times p})^{-1} I_{p,r}(\iota(y) - \iota(x))$ as the local Mahalanobis distance between z and y, or the distance between the latent variables related to z and y. Thus, when $\alpha = 0$, the kernel comes from averaging out the pairwise local Mahalanobis distances, and hence depends on the local geometric structure.…”
Section: 5 (mentioning)
confidence: 97%
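The quoted passage treats $I_{p,r}(C_x + cI_{p\times p})^{-1} I_{p,r}$ as a regularized precision matrix and uses it as a bilinear form between local differences. A minimal NumPy sketch of one possible reading is given below; the function name, the estimation of $C_x$ from a neighbor array, and the interpretation of $I_{p,r}$ as a $p \times r$ coordinate truncation are illustrative assumptions, not the cited papers' code.

```python
import numpy as np

def local_mahalanobis(x, y, z, neighbors, r, c=1e-3):
    """Sketch: squared local Mahalanobis distance between z and y, measured
    through the regularized precision matrix of the local covariance at x.

    neighbors : (k, p) array of points near x, used to estimate C_x
    r         : assumed intrinsic dimension (only the top-r coordinates kept)
    c         : regularization constant added before inversion
    """
    p = x.shape[0]
    diffs = neighbors - x                             # k x p local differences
    C_x = diffs.T @ diffs / diffs.shape[0]            # local covariance at x
    precision = np.linalg.inv(C_x + c * np.eye(p))    # regularized precision matrix
    I_pr = np.eye(p)[:, :r]                           # p x r truncation (assumed form of I_{p,r})
    prec_r = I_pr.T @ precision @ I_pr                # truncated precision, r x r
    u = I_pr.T @ (z - x)                              # truncated coordinates of z - x
    v = I_pr.T @ (y - x)                              # truncated coordinates of y - x
    return float(u @ prec_r @ v)
```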
“…It has been widely applied in different fields, and had been cited more than 12,500 times by the end of 2018 (according to Google Scholar). However, its theoretical justification only became available at the end of 2017 [29,15]. Essentially, the established theory says that under the manifold setup, LLE has several peculiar behaviors that are very different from those of diffusion-based algorithms, including eigenmap, DM and VDM.…”
Section: Introduction (mentioning)
confidence: 99%
“…We mention that this approach leads to a more accurate geodesic distance estimation by directly hard-thresholding the noisy covariance matrix to remove the influence of noise. A more general discussion can be found in [26].…”
Section: Local Mahalanobis Distance and Empirical Intrinsic Geometry (mentioning)
confidence: 99%
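The hard-thresholding step mentioned in this passage can be illustrated with a short sketch: given an estimated intrinsic dimension r, the noisy local covariance is replaced by its best rank-r approximation, so that noise-dominated directions do not enter the subsequent geodesic distance estimation. This is a plausible reading under that assumption, not the exact procedure of [26].

```python
import numpy as np

def hard_threshold_covariance(C_noisy, r):
    """Sketch: keep only the r largest eigenpairs of a noisy (symmetric)
    local covariance matrix and zero out the rest."""
    evals, evecs = np.linalg.eigh(C_noisy)   # eigenvalues in ascending order
    keep = np.argsort(evals)[::-1][:r]       # indices of the r largest eigenvalues
    U = evecs[:, keep]
    lam = np.clip(evals[keep], 0.0, None)    # guard against small negative values
    return (U * lam) @ U.T                   # rank-r reconstruction of the covariance
```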