2020
DOI: 10.48550/arxiv.2001.05486
Preprint

i-flow: High-dimensional Integration and Sampling with Normalizing Flows

Christina Gao, Joshua Isaacson, Claudius Krause

Abstract: In many fields of science, high-dimensional integration is required. Numerical methods have been developed to evaluate these complex integrals. We introduce the code i-flow, a Python package that performs high-dimensional numerical integration utilizing normalizing flows. Normalizing flows are machine-learned, bijective mappings between two distributions. i-flow can also be used to sample random points according to complicated distributions in high dimensions. We compare i-flow to other algorithms for high-dim…
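The abstract's core idea, estimating an integral by reweighting samples drawn through a bijective map, can be sketched in plain NumPy. This is a minimal illustration of the underlying importance-sampling identity, not i-flow's actual API; the identity map stands in here for a trained normalizing flow, and the integrand `f` is an arbitrary example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Example integrand: a Gaussian peak inside the unit square.
def f(x):
    return np.exp(-50.0 * np.sum((x - 0.5) ** 2, axis=1))

# A normalizing flow would learn a bijection T: [0,1]^d -> [0,1]^d
# whose induced density q approximates f. Here we use the identity
# map as a stand-in, so q(x) = 1 and this reduces to uniform
# Monte Carlo; a trained flow would sharply reduce the variance.
n, d = 100_000, 2
u = rng.random((n, d))   # samples from the base distribution
x = u                    # x = T(u); identity map as placeholder
q = np.ones(n)           # density of x under the map

# Importance-sampling identity: E_{x~q}[f(x)/q(x)] equals the integral.
weights = f(x) / q
estimate = weights.mean()
error = weights.std() / np.sqrt(n)
```

With a learned flow, `q` tracks the shape of `f`, so the weights become nearly constant and the Monte Carlo error shrinks; the same `weights` can also be used to accept/reject samples, which is the sampling use case the abstract mentions.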


Cited by 14 publications (21 citation statements). References 33 publications.
“…Technically, our BayesFlow approach [47] is based on the conditional version [48,49] of invertible networks (INNs) [50][51][52], a specific realization of normalizing flows [53][54][55][56]. These networks have been studied in relation to phase space generation [57][58][59][60], event generation [61], anomaly detection [62], detector and parton shower unfolding [63], and density estimation [64]. We will introduce our QCD inference framework in Sec.…”
Section: Introduction
confidence: 99%
“…• We would like to point out that our argument is not pertinent to parton-level ML-based generative approaches which learn directly from an oracle which can be queried for the underlying true distribution (matrix-element), instead of learning from data generated under that distribution [34][35][36][37][38][39][40].…”
Section: Caveats
confidence: 99%
“…This approach has been investigated in many recent work; see e.g. Gao et al (2020); Noé et al (2019); Wirnsberger et al (2020). Although it is attractive, it is also well-known that optimizing this 'mode-seeking' KL can lead to an approximation of the target T # π 0 which has thinner tails than the target π and ignore some of its modes; see e.g.…”
Section: Introduction
confidence: 99%
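The "mode-seeking" behavior this excerpt warns about can be stated schematically. Using the quoted work's notation (π is the target density, π_0 the base distribution, and T_# π_0 the pushforward under the learned map T), the training objective in question is the reverse KL divergence:

```latex
\mathrm{KL}\left(T_{\#}\pi_0 \,\middle\|\, \pi\right)
  = \mathbb{E}_{x \sim T_{\#}\pi_0}\!\left[\log \frac{(T_{\#}\pi_0)(x)}{\pi(x)}\right]
```

This objective heavily penalizes placing mass where π is small, but does not penalize leaving regions of π uncovered, so a minimizer can concentrate on a subset of π's modes and produce thinner tails than the target, which is exactly the failure mode the excerpt describes.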