2019
DOI: 10.48550/arxiv.1906.04032
Preprint

Neural Spline Flows

Conor Durkan, Artur Bekasov, Iain Murray, et al.

Abstract: A normalizing flow models a complex probability density as an invertible transformation of a simple base density. Flows based on either coupling or autoregressive transforms both offer exact density evaluation and sampling, but rely on the parameterization of an easily invertible elementwise transformation, whose choice determines the flexibility of these models. Building upon recent work, we propose a fully-differentiable module based on monotonic rational-quadratic splines, which enhances the flexibility of …
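For context, the standard change-of-variables identity that the abstract refers to: if a flow maps base samples $z \sim p_Z$ through an invertible $f$ to $x = f(z)$, the model density is

```latex
p_X(x) = p_Z\bigl(f^{-1}(x)\bigr)\,
         \left|\det \frac{\partial f^{-1}(x)}{\partial x}\right|
```

Coupling and autoregressive transforms are designed so that both $f^{-1}$ and this Jacobian determinant are cheap to compute, which is what makes exact density evaluation and sampling possible.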

Cited by 83 publications (128 citation statements)
References 18 publications
“…For the normalizing flow in this work, we build on the same basic architecture and training procedure as Ting & Weinberg (2021), using a batch size of 1024 and 500 epochs. This architecture consists of eight units of a "Neural Spline Flow", each comprising three layers of densely connected neural networks with 16 neurons, coupled with a "Conv1x1" operation (often referred to as "GLOW" in the machine learning literature; see, e.g., Kingma & Dhariwal 2018; Durkan et al. 2019). See Green & Ting (2020); Ting & Weinberg (2021) for additional details, particularly Fig.…”
Section: Inferring the Population Density With Normalizing Flows
confidence: 99%
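As a rough illustration of the stack described in this excerpt (not the cited authors' actual code), such a flow might be assembled with the `nflows` package released alongside Durkan et al. (2019). Only the eight units, the dense 16-neuron conditioners, and the 1x1 mixing come from the quoted text; the dimensionality, number of bins, tail bound, and conditioner depth below are assumptions, and class names assume the nflows ~0.14 API.

```python
# Sketch only: an nflows assembly approximating the quoted architecture.
# For non-image data, GLOW's invertible 1x1 convolution reduces to an
# invertible linear mixing over features; LULinear is nflows' LU-decomposed
# version and stands in for "Conv1x1" here.
import torch
from nflows.flows.base import Flow
from nflows.distributions.normal import StandardNormal
from nflows.transforms.base import CompositeTransform
from nflows.transforms.coupling import PiecewiseRationalQuadraticCouplingTransform
from nflows.transforms.lu import LULinear
from nflows.nn.nets import ResidualNet

features = 6  # e.g., 6-D phase-space coordinates (assumed)

def conditioner(in_features, out_features):
    # Small dense network parameterizing the spline: 16 hidden units as in
    # the quoted description; the depth (num_blocks) is an assumption.
    return ResidualNet(in_features, out_features,
                       hidden_features=16, num_blocks=2)

transforms = []
for i in range(8):  # eight flow units, as in the quoted text
    transforms.append(LULinear(features))  # "Conv1x1"/GLOW-style mixing
    # Alternate which half of the features gets transformed (mask > 0).
    mask = (torch.arange(features) % 2 == i % 2).int()
    transforms.append(PiecewiseRationalQuadraticCouplingTransform(
        mask=mask,
        transform_net_create_fn=conditioner,
        num_bins=8,       # assumption
        tails="linear",
        tail_bound=3.0,   # assumption
    ))

flow = Flow(CompositeTransform(transforms), StandardNormal(shape=[features]))
```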
“…To estimate the density of the dataset, it is proposed to use normalizing flows, which are primarily applied to generative modeling [48–53] and are increasingly being used for scientific applications [54,55]. Given a dataset, generative modeling attempts to create new data points that were previously unseen but are distributed like the original dataset.…”
Section: Probability Map Estimation
confidence: 99%
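To make the density-estimation objective concrete: a flow is fit by maximizing the exact log-likelihood of the data, and new points are then drawn from the learned density. A minimal sketch, assuming a `flow` object with the nflows-style `log_prob`/`sample` interface and a `data` tensor of shape (N, features); the optimizer settings are illustrative, while the batch size and epoch count follow the setup quoted earlier:

```python
import torch

# Maximum-likelihood training loop for a normalizing flow.
optimizer = torch.optim.Adam(flow.parameters(), lr=1e-3)
loader = torch.utils.data.DataLoader(torch.utils.data.TensorDataset(data),
                                     batch_size=1024, shuffle=True)
for epoch in range(500):
    for (batch,) in loader:
        loss = -flow.log_prob(batch).mean()  # exact negative log-likelihood
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# "Generative" use: draw previously unseen points from the learned density.
new_points = flow.sample(10_000)
```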
“…Importantly, normalizing flows have been shown to perform reasonably well for image generation, which shows that they can learn PDFs in ∼10³-dimensional phase-space. Different types of coupling layers can be used for density estimation [48–50]. Among them, neural spline flows are chosen, as they have been shown to capture complex multimodal distributions with a small number of trainable parameters [50].…”
Section: Probability Map Estimation
confidence: 99%
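For reference, the monotonic rational-quadratic spline these flows use (Durkan et al. 2019, following Gregory & Delbourgo 1982) is defined per bin: with knots $(x^{(k)}, y^{(k)})$, positive knot derivatives $\delta^{(k)}$, bin slope $s^{(k)} = (y^{(k+1)} - y^{(k)})/(x^{(k+1)} - x^{(k)})$, and local coordinate $\xi = (x - x^{(k)})/(x^{(k+1)} - x^{(k)})$,

```latex
g(x) = y^{(k)} +
  \frac{\bigl(y^{(k+1)} - y^{(k)}\bigr)\bigl[s^{(k)}\xi^{2} + \delta^{(k)}\xi(1-\xi)\bigr]}
       {s^{(k)} + \bigl[\delta^{(k+1)} + \delta^{(k)} - 2s^{(k)}\bigr]\xi(1-\xi)}
```

Positivity of the $\delta^{(k)}$ guarantees monotonicity, and inverting $g$ only requires solving a quadratic in $\xi$, which is why the transform remains analytically invertible while needing few trainable parameters per dimension.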
“…However, these assumptions are not always fulfilled (Blundell et al., 2015; Iwata & Ghahramani, 2017; Kuleshov et al., 2018; Lakshminarayanan et al., 2016; Sun et al., 2019). Multiple observation noise models and normalizing flows (Durkan et al., 2019; Gopal & Key, 2021) can generalize to any noise distribution, but it is left to the practitioner's expertise to choose an appropriate likelihood function. Quantile regression (Gasthaus et al., 2019; Han et al., 2021; Tagasovska & Lopez-Paz, 2018) avoids distributional assumptions and can be flexibly combined with many forecasting models.…”
Section: Related Work
confidence: 99%
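To illustrate the quantile-regression alternative mentioned in this excerpt: instead of choosing a likelihood, one trains a prediction head per quantile with the pinball loss, whose minimizer is the corresponding conditional quantile. A minimal sketch (the function name and quantile grid are illustrative, not from any cited paper):

```python
import torch

def pinball_loss(pred, target, q):
    # Pinball (quantile) loss: under-prediction is penalized with weight q,
    # over-prediction with weight (1 - q); the minimizer is the q-th quantile.
    diff = target - pred
    return torch.mean(torch.maximum(q * diff, (q - 1) * diff))

# Example: one output head per quantile, no distributional assumption needed.
quantiles = [0.1, 0.5, 0.9]
```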