Proceedings of the Twenty-Sixth Annual Symposium on Computational Geometry 2010
DOI: 10.1145/1810959.1811006

Topological inference via meshing

Abstract: We apply ideas from mesh generation to improve the time and space complexities of computing the full persistent homological information associated with a point cloud P in Euclidean space R^d. Classical approaches rely on the Čech, Rips, α-complex, or witness complex filtrations of P, whose complexities scale up very badly with d. For instance, the α-complex filtration incurs the n^Ω(d) size of the Delaunay triangulation, where n is the size of P. The common alternative is to truncate the filtrations when the…
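
For context, the classical pipeline the abstract contrasts with can be sketched in a few lines: build a (truncated) Rips filtration on the point cloud and compute its persistence. The following is a minimal illustration, assuming the GUDHI Python library; the random point cloud, truncation radius, and dimensions are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of the classical approach: a Vietoris-Rips filtration on a
# point cloud P, truncated at a maximum edge length (the "common alternative"
# the abstract mentions), followed by persistence computation.
import numpy as np
import gudhi

# Sample a small point cloud P in R^d (here d = 3); purely illustrative.
rng = np.random.default_rng(0)
P = rng.random((200, 3))

# Rips filtration truncated at max_edge_length to keep the complex small.
rips = gudhi.RipsComplex(points=P, max_edge_length=0.5)
st = rips.create_simplex_tree(max_dimension=2)

# Persistence pairs (dimension, (birth, death)) of the truncated filtration.
diagram = st.persistence()
print(st.num_simplices(), "simplices in the truncated Rips filtration")
```

Even with truncation, the size of such complexes grows quickly with the ambient dimension, which is the scaling problem the paper's mesh-based approach targets.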

Cited by 19 publications (28 citation statements). References 28 publications.
“…A different approach to the problem of building sparse filtrations for offsets of point clouds in Euclidean space was presented by Hudson et al [22]. They used ideas from Delaunay refinement mesh generation to build linear-size filtrations that provide provably good approximations to the persistence diagram of the offsets.…”
Section: Related Work
confidence: 99%
“…As a proof of concept, we ran our code on the so-called Clifford data set from [22], which was obtained by evenly spacing 2,000 points along the line l : y = 20x mod 2π in the second flat torus (R mod 2π)^2, then mapping the points onto the Clifford torus in R^4 via the embedding f : (u, v) → (cos u, sin u, cos v, sin v). This data set admits three nontrivial candidate underlying spaces: at small scales, the image of l through f, which is a closed helicoidal curve on the torus; at larger scales, the torus itself; at even larger scales, the 3-sphere of radius √2 on which the torus is sitting.…”
Section: Manufactured Data
confidence: 99%
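
For illustration, the construction of the data set quoted above can be reproduced in a few lines. This is a sketch following the quoted description (2,000 evenly spaced points, slope 20, embedding f), not the authors' original code.

```python
# Sketch of the "Clifford" data set described above: points spaced evenly
# along the line y = 20x (mod 2*pi) in the flat torus, then mapped onto the
# Clifford torus in R^4 via f(u, v) = (cos u, sin u, cos v, sin v).
import numpy as np

n = 2000
u = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)  # parameter along the line
v = np.mod(20.0 * u, 2.0 * np.pi)                      # y = 20x mod 2*pi

# Embedding into the Clifford torus in R^4.
clifford = np.column_stack((np.cos(u), np.sin(u), np.cos(v), np.sin(v)))
print(clifford.shape)  # (2000, 4); every point lies on the 3-sphere of radius sqrt(2)
```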
“…Delaunay meshes are used extensively in graphics and scientific computing, and more recently, they have been applied in higher dimensions for data analysis [3,10]. However, there are serious difficulties in constructing meshes in more than three dimensions.…”
Section: Introduction
confidence: 99%
“…This includes primarily filtrations constructed on grids or quality meshes. Hudson et al [HMOS10] show that the persistent homology of the Euclidean distance to a point cloud can be approximated on the type of mesh used in finite element analysis. Sheehy [She11] extends this result to a large class of Lipschitz functions.…”
confidence: 99%
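
As a rough illustration of the grid-based side of this idea (a regular grid rather than the quality meshes of [HMOS10]), one can filter a grid by the Euclidean distance to a point cloud and compute persistence of the resulting cubical complex. A minimal sketch, assuming GUDHI and SciPy; the circle sample and grid resolution are placeholders.

```python
# Sketch: persistence of the Euclidean distance function to a point cloud,
# approximated on a regular grid (sublevel-set filtration of a cubical
# complex). A grid-based stand-in for the mesh-based filtrations above.
import numpy as np
import gudhi
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 100)
P = np.column_stack((np.cos(theta), np.sin(theta)))  # points on a circle

# Evaluate the distance-to-P function on a regular grid over [-1.5, 1.5]^2.
res = 128
xs = np.linspace(-1.5, 1.5, res)
X, Y = np.meshgrid(xs, xs)
grid = np.column_stack((X.ravel(), Y.ravel()))
dist = cKDTree(P).query(grid)[0].reshape(res, res)

# Sublevel-set filtration of the grid values via a cubical complex.
cc = gudhi.CubicalComplex(top_dimensional_cells=dist)
diagram = cc.persistence()  # one prominent 1-dimensional class (the circle)
```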