2022
DOI: 10.1109/tpami.2021.3135007
Learning and Meshing From Deep Implicit Surface Networks Using an Efficient Implementation of Analytic Marching


Cited by 8 publications (5 citation statements: 0 supporting, 5 mentioning, 0 contrasting)
References 37 publications
“…Aiming for exact extraction, initial efforts focused on identifying linear regions (Serra et al., 2018) and employing Analytical Marching (Lei & Jia, 2020) to exactly reconstruct the zero-level isosurface. Contemporary approaches involve region subdivision, wherein the regions of the complex are progressively subdivided from neuron to neuron and layer to layer, allowing for the efficient calculation of the exponentially increasing linear regions (Raghu et al., 2017; Hanin & Rolnick, 2019; Humayun et al., 2023; Berzins, 2023).…”
Section: Related Work (mentioning)
confidence: 99%
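The statement above rests on the key fact behind analytic marching: inside one linear region of a ReLU network, the implicit function collapses to a single affine map, so its zero-level set in that region is exactly a plane. A minimal sketch of that reduction, using a hypothetical one-hidden-layer network with random weights (an illustration, not code from the paper or any citing work):

```python
import numpy as np

# Hypothetical one-hidden-layer ReLU network f(x) = W2 @ relu(W1 @ x + b1) + b2
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 3)), rng.normal(size=16)
W2, b2 = rng.normal(size=(1, 16)), rng.normal(size=1)

def local_affine_map(x):
    """Return (a, c) with f(y) = a @ y + c on the linear region containing x."""
    d = (W1 @ x + b1 > 0).astype(float)   # activation pattern, fixed on the region
    a = ((W2 * d) @ W1).ravel()           # effective weights: W2 @ diag(d) @ W1
    c = ((W2 * d) @ b1 + b2).item()       # effective bias
    return a, c

x = np.array([0.2, -0.1, 0.4])
a, c = local_affine_map(x)
f_x = (W2 @ np.maximum(W1 @ x + b1, 0.0) + b2).item()
assert np.isclose(a @ x + c, f_x)         # the affine map matches the network at x
# Within this region, the zero isosurface is exactly the plane {y : a @ y + c = 0}.
```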
“…Recent advancements in visualizing deep neural networks (Zhang et al., 2020; Lei & Jia, 2020; Lei et al., 2021; Berzins, 2023) contribute significantly to understanding the intricate structures that define these networks. This progress not only provides valuable insights into their expressivity, robustness, training methodologies, and distinctive geometry, but also, by leveraging the inherent piecewise linearity of Continuous Piecewise Affine (CPWA) functions, e.g., ReLU neural networks, represents each region as a convex polyhedron; the assembly of these polyhedral sets constructs a polyhedral complex that delineates the decision boundaries of neural networks (Grigsby & Lindsey, 2022).…”
Section: Introduction (mentioning)
confidence: 99%
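The quoted passage treats each linear region as a convex polyhedron. Concretely, for a one-hidden-layer ReLU network, the region containing a point is cut out by one halfspace per hidden neuron, obtained by freezing that neuron's activation sign. A minimal sketch under that assumption (hypothetical weights, illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(16, 3)), rng.normal(size=16)  # hypothetical hidden layer

def region_halfspaces(x):
    """Halfspace form {y : A @ y <= b} of the linear region containing x."""
    s = np.where(W1 @ x + b1 > 0, 1.0, -1.0)  # frozen activation signs at x
    # s_i * (W1[i] @ y + b1[i]) >= 0  <=>  -s_i * W1[i] @ y <= s_i * b1[i]
    return -s[:, None] * W1, s * b1

x = np.array([0.3, 0.1, -0.2])
A, b = region_halfspaces(x)
assert np.all(A @ x <= b + 1e-9)  # x satisfies its own region's inequalities
# The polyhedra over all activation patterns tile the input space, forming the
# polyhedral complex the quotation describes.
```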
“…Marching” [81] that can extract the exact polygonal mesh from neural implicit representations, the gap between neural implicit and explicit meshes may diminish in the future.…”
Section: Representations (mentioning)
confidence: 99%
“…SRNs were first introduced for representing 3D opaque meshes, either as occupancy grids [MON*19, MLL*21] or signed distance fields [TLY*21, DNJ20, LJM21, CLI*20]. In these methods, the networks were trained in world-space, that is, from pairs of position to data value.…”
Section: Related Work (mentioning)
confidence: 99%