Topology of deep neural networks
2020 · Preprint
DOI: 10.48550/arxiv.2004.06093
Cited by 7 publications (9 citation statements) · References 0 publications
“…Similar to a recent work that exploits graph-based manifolds for the topology analysis of DNNs (Naitzat et al, 2020) (Zhang et al, 2013).…”
Section: Graph-based Manifold Construction
confidence: 95%
“…Pair-wise distance calculations on the manifold will be key to estimating γ_F^max. To this end, the geodesic distance metric is arguably the most natural choice: for graph-based manifold problems, the shortest-path distance metric has been adopted to approximate the geodesic distance on a manifold in nonlinear dimensionality reduction and neural network topological analysis (Tenenbaum et al, 2000; Naitzat et al, 2020). However, exhaustively searching for γ_F^max would require computing all-pairs shortest paths between the N input (output) data points, which can be prohibitively expensive even when taking advantage of the state-of-the-art randomized method (Williams, 2018).…”
Section: Computing γ_F
confidence: 99%
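The shortest-path approximation of geodesic distance mentioned in the statement above can be sketched as follows. This is a minimal illustration of the Isomap-style idea (Tenenbaum et al, 2000), not code from the cited papers; the function names `knn_graph` and `shortest_path` are my own.

```python
# Sketch (an assumption, not the cited papers' code): approximate geodesic
# distance on a sampled manifold by shortest paths in a k-nearest-neighbour
# graph, as in Isomap-style nonlinear dimensionality reduction.
import math
import heapq

def knn_graph(points, k):
    """Adjacency list: each point is connected to its k nearest neighbours."""
    n = len(points)
    adj = [[] for _ in range(n)]
    for i in range(n):
        dists = sorted(
            (math.dist(points[i], points[j]), j) for j in range(n) if j != i
        )
        for d, j in dists[:k]:
            adj[i].append((j, d))
            adj[j].append((i, d))  # keep the graph symmetric
    return adj

def shortest_path(adj, src, dst):
    """Dijkstra: graph shortest-path length as a geodesic-distance proxy."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, math.inf):
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return math.inf

# Points sampled on a half-circle of radius 1: the Euclidean distance between
# the endpoints is the chord (2.0), while the true geodesic (arc length) is
# pi; the graph shortest-path distance lands near pi, not near 2.
pts = [(math.cos(t * math.pi / 50), math.sin(t * math.pi / 50)) for t in range(51)]
adj = knn_graph(pts, 2)
g = shortest_path(adj, 0, 50)
```

As the statement notes, doing this for all pairs of N points is the expensive part: each Dijkstra run is O(E log N), and an exhaustive search needs N such runs.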
“…Olah performed a number of topological experiments illustrating the importance of considering the topology of the underlying data when designing a neural network. In [26], the activations of a binary-classification neural network were considered as point clouds on which the layer functions of the network act. The topologies of these activations are then studied using homological tools such as persistent homology [10].…”
Section: Previous Work
confidence: 99%
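The persistent-homology idea referenced above can be illustrated in its simplest (0-dimensional) form. This is my own minimal sketch, not the cited paper's pipeline: for H0 of a Vietoris–Rips filtration, the "death" times of connected components are exactly the edge lengths of the minimum spanning tree of the pairwise distances (single-linkage clustering), which a plain Prim's algorithm computes.

```python
# Sketch (an assumption, not the paper's code): 0-dimensional persistent
# homology of a point cloud. Each point is born a separate component at
# scale 0; components merge ("die") at the MST edge lengths.
import math

def h0_deaths(points):
    """Return sorted H0 death times via Prim's MST on the complete graph."""
    n = len(points)
    in_tree = [False] * n
    best = [math.inf] * n
    best[0] = 0.0
    deaths = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        if best[u] > 0.0:
            deaths.append(best[u])  # a component merges (dies) at this scale
        for v in range(n):
            if not in_tree[v]:
                best[v] = min(best[v], math.dist(points[u], points[v]))
    return sorted(deaths)

# Two tight clusters far apart: four small deaths inside the clusters and one
# large death at the inter-cluster gap, so persistence separates the scales.
cloud = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 0.0), (5.1, 0.0), (5.0, 0.1)]
deaths = h0_deaths(cloud)
```

Higher-dimensional features (loops, voids) require a full boundary-matrix reduction; dedicated libraries such as Ripser or GUDHI are normally used for that in practice.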
“…The combined deformation needed to curve the plane and to "make" the holes causes the generative function to be highly nonlinear. Note that for a DGM that uses a DNN with ReLU activation functions as its generator g(z), it is also possible for g(z) to change the topology of the input through "folding" transformations (Naitzat et al, 2020).…”
Section: Deep Generative Models (DGM) to Represent Realistic Patterns
confidence: 99%
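The "folding" remark above has a very concrete small-scale illustration, sketched below under my own assumptions (this is not Naitzat et al.'s experiment). Coordinate-wise ReLU is many-to-one: every point of a unit circle lying in the closed third quadrant is folded onto the origin, so the loop (the 1-dimensional hole) is destroyed in the image.

```python
# Sketch (an assumption illustrating ReLU "folding", not the cited paper's
# code): coordinate-wise ReLU collapses the negative quadrant of a circle
# onto a single point, changing the topology of the input.
import math

def relu(v):
    """Coordinate-wise ReLU, max(x, 0), applied to a point."""
    return tuple(max(x, 0.0) for x in v)

# Sample 100 points on the unit circle centred at the origin.
circle = [(math.cos(2 * math.pi * t / 100), math.sin(2 * math.pi * t / 100))
          for t in range(100)]
image = [relu(p) for p in circle]

# Many distinct inputs map to exactly (0, 0): the map folds, so the image
# is an arc with a point attached rather than a loop -- the hole is gone.
collapsed = [p for p, q in zip(circle, image) if q == (0.0, 0.0)]
```

A homeomorphism could never do this; it is precisely the non-injectivity of ReLU that lets a network simplify the topology of its input layer by layer.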
“…In order to approximate manifolds of realistic patterns, most common DGMs involve (artificial) neural networks with several layers and nonlinear (activation) functions. Recently, it has been shown that deep neural networks with ReLU activation functions are able to change the topology of the input (Naitzat et al, 2020); so, besides nonlinearity, the generator of a DGM may also induce changes in topology when mapping from the latent space. When the sole purpose of the DGM is to generate new samples, high nonlinearity and induced topology changes are not important, but they might be an issue when the DGM is used for additional tasks, such as inversion.…”
Section: Introduction
confidence: 99%