2020
DOI: 10.1142/s1793525320500442

Graph approximations to the Laplacian spectra

Abstract: I prove that the spectrum of the Laplace-Beltrami operator with the Neumann boundary condition on a compact Riemannian manifold with boundary admits a fast approximation by the spectra of suitable graph Laplacians on proximity graphs on the manifold, and that a similar graph approximation works for metric-measure spaces glued out of compact Riemannian manifolds of the same dimension.
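The abstract's claim can be illustrated numerically on the simplest compact manifold with boundary, the interval [0, 1], whose Neumann Laplacian has spectrum (kπ)², k = 0, 1, 2, … The sketch below is not the paper's construction: the Gaussian kernel, the bandwidth eps, the sample size, and the 1/eps rescaling (which fixes the spectrum only up to a kernel-dependent constant) are all illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
N = 800
# Uniform samples from [0, 1]: a compact 1-D manifold with boundary,
# where the Neumann Laplacian has eigenvalues (k*pi)^2, k = 0, 1, 2, ...
x = np.sort(rng.uniform(0.0, 1.0, N)).reshape(-1, 1)

eps = 0.002  # kernel bandwidth (illustrative choice, not from the paper)
W = np.exp(-cdist(x, x, "sqeuclidean") / eps)  # Gaussian proximity weights

# Random-walk graph Laplacian; this normalization is the one known to
# recover the Neumann Laplacian on manifolds with boundary. The 1/eps
# rescaling recovers the spectrum only up to a constant factor, so below
# we look at eigenvalue ratios rather than absolute values.
D = W.sum(axis=1)
L_rw = (np.eye(N) - W / D[:, None]) / eps

evals = np.sort(np.linalg.eigvals(L_rw).real)
print(np.round(evals[:5], 2))
```

If the approximation is working, the smallest eigenvalue is 0 and the low eigenvalues scale roughly like 0 : 1 : 4 : 9 (up to the unknown constant), mirroring the Neumann spectrum.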

Cited by 7 publications (14 citation statements)
References 11 publications
“…Re-arranging the terms gives that μ_k < λ_k + 0.8γ_K. This can be verified for all 2 ≤ k ≤ K, and note that the good event E^(1) is w.r.t. X, and E_UB is constructed for fixed k_max, and none is for specific k ≤ K. The positive integer k_max is fixed, and the constant γ_K is half of the minimum first-K eigen-gaps, defined as in (22). Eigenvalue UB and initial LB are proved for k ≤ K, which guarantees (41).…”
Section: Un-normalized Graph Laplacian
confidence: 83%
“…The current result may be extended in several directions. First, for manifold with smooth boundary, the random-walk graph Laplacian recovers the Neumann Laplacian [10], and one can expect to prove the spectral convergence as well, such as in [22]. Second, extension to kernel with variable or adaptive bandwidth [5,9], and other normalization schemes, e.g., bi-stochastic normalization [23,20,36], would be important to improve the robustness against low sampling density and noise in data, and even the spectral convergence as well.…”
Section: Discussion
confidence: 99%
“…However, theoretical estimates for spectral data in the literature have been much weaker than this. The standard bound on the bias error in the spectral data has been the naive estimate of O(ε^{1/2}), corresponding to the L^p → L^p operator error (Hein et al. 2005, Shi 2015, Lu 2020). While the decay of the variance error as M → ∞ with ε fixed has long been known as a result of the theory of Glivenko-Cantelli function classes (von Luxburg et al. 2004, 2008, Belkin & Niyogi 2007), this approach has yielded only weak quantitative bounds of O(M^{-1/2} ε^{-d-3}) on the variance error (Shi 2015).…”
Section: Introduction
confidence: 99%
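The two error terms quoted in this excerpt trade off against each other: the bias O(ε^{1/2}) grows with the bandwidth ε while the variance O(M^{-1/2} ε^{-d-3}) shrinks with it. As a purely illustrative consequence (not a computation from either paper), equating the two terms predicts the balancing bandwidth ε* ~ M^{-1/(2d+7)}; the dimension d and sample size M below are arbitrary choices, and the comparison ignores the unknown constants in the O(·) bounds.

```python
import numpy as np
from scipy.optimize import minimize_scalar

d = 2       # assumed intrinsic dimension (illustrative)
M = 10**8   # assumed number of sample points (illustrative)

def total_error(log_eps):
    """Sum of the quoted bias and variance rates, constants set to 1."""
    eps = np.exp(log_eps)
    return eps**0.5 + M**-0.5 * eps**(-d - 3)

# Minimize over the bandwidth (in log scale, for numerical stability).
res = minimize_scalar(total_error, bounds=(-20.0, 0.0), method="bounded")
eps_star = np.exp(res.x)

# Equating bias and variance terms predicts eps* ~ M^{-1/(2d+7)};
# the true minimizer differs from this only by a constant factor.
predicted = M ** (-1.0 / (2 * d + 7))
print(eps_star, predicted)
```

The numerical minimizer and the balance heuristic agree up to a dimension-dependent constant, which is the usual situation when matching rates rather than constants.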