2022
DOI: 10.48550/arxiv.2203.03934
Preprint

Nonlinear Isometric Manifold Learning for Injective Normalizing Flows

Abstract: To model manifold data using normalizing flows, we propose to employ the isometric autoencoder to design nonlinear encodings with explicit inverses. The isometry allows us to separate manifold learning and density estimation and train both parts to high accuracy. Applied to the MNIST data set, the combined approach generates high-quality images.
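
A quick way to see why the isometry separates the two learning problems (a sketch in our own notation, not reproduced from the paper): for an injective decoder g : R^d -> R^D, the change-of-variables density on the learned manifold involves a Jacobian factor that an isometric map makes trivial.

```latex
% Injective change of variables (d < D); p_Z is the latent density, J_g the Jacobian of g.
\[
  p_X\big(g(z)\big) \;=\; p_Z(z)\,\det\!\big(J_g(z)^{\top} J_g(z)\big)^{-1/2}.
\]
% If g is an isometric embedding, J_g(z)^{\top} J_g(z) = I_d, so the determinant
% factor equals 1 and p_X(g(z)) = p_Z(z): fitting p_Z in latent space (density
% estimation) can be done independently of fitting g (manifold learning).
```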

Cited by 1 publication (2 citation statements)
References 15 publications

“…This results in a model where the likelihood is only defined along the signal space and therefore cannot be expressed as a probability density. However, if the mapping φ(y; θ) is injective, the on-manifold density p_M(φ(z); θ) exists and it is tractable for some choice of architecture [6,15,45,43,13]. This function can be extended to off-manifold data-points by applying a pseudo-inverse, i.e.…”
Section: Preliminaries (mentioning, confidence: 99%)
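
The pseudo-inverse extension alluded to in the quote (the sentence is truncated above) is commonly written as follows; the notation is ours and only sketches the general construction, not any specific cited architecture.

```latex
% Off-manifold evaluation: project y onto the learned manifold via a
% (pseudo-)inverse of phi, then evaluate the on-manifold density at the projection.
\[
  \varphi^{\dagger}(y) \;=\; \operatorname*{arg\,min}_{z}\;\lVert \varphi(z;\theta) - y \rVert_2,
  \qquad
  p(y) \;\approx\; p_{\mathcal{M}}\!\big(\varphi(\varphi^{\dagger}(y);\theta);\theta\big).
\]
```
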
“…Injective flows strive for the same goal by using injective deterministic transformations to map the data to a lower dimensional base density. [6], [43] and [13] train injective flows with a two-steps training procedure in which they alternate manifold and density training, while [45] introduces lower bounds on the injective change of variable. [15] combines injective flows with an additive noise model to account for deviations from the learned manifold using stochastic inversions trained on a variational bound.…”
Section: Related Work (mentioning, confidence: 99%)
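
A minimal sketch of the alternating two-step procedure described in the quote, assuming a standard autoencoder for the manifold and a placeholder latent density model; all module names, losses, and hyperparameters here are illustrative, not the authors' code.

```python
import torch
import torch.nn as nn

D, d = 784, 16  # ambient and latent dimensions (illustrative values)

encoder = nn.Sequential(nn.Linear(D, 256), nn.ReLU(), nn.Linear(256, d))
decoder = nn.Sequential(nn.Linear(d, 256), nn.ReLU(), nn.Linear(256, D))

# Placeholder latent density (diagonal Gaussian); in the cited works this role
# is played by a normalizing flow over the latent space.
flow_mu = nn.Parameter(torch.zeros(d))
flow_log_sigma = nn.Parameter(torch.zeros(d))

def latent_log_prob(z):
    return torch.distributions.Normal(flow_mu, flow_log_sigma.exp()).log_prob(z).sum(-1)

opt_manifold = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
opt_density = torch.optim.Adam([flow_mu, flow_log_sigma], lr=1e-3)

x = torch.randn(128, D)  # dummy batch standing in for MNIST-like data

for step in range(100):
    if step % 2 == 0:
        # Manifold step: reconstruction loss (an isometry penalty would be added here).
        z = encoder(x)
        loss = ((decoder(z) - x) ** 2).mean()
        opt_manifold.zero_grad()
        loss.backward()
        opt_manifold.step()
    else:
        # Density step: maximize latent log-likelihood with the manifold frozen.
        with torch.no_grad():
            z = encoder(x)
        loss = -latent_log_prob(z).mean()
        opt_density.zero_grad()
        loss.backward()
        opt_density.step()
```

Freezing the encoder during the density step mirrors the separation the abstract attributes to the isometry: the latent density is fit against a fixed encoding rather than a moving target.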