2021
DOI: 10.48550/arxiv.2110.11377
Preprint
CaloFlow II: Even Faster and Still Accurate Generation of Calorimeter Showers with Normalizing Flows

Abstract: Recently, we introduced CaloFlow, a high-fidelity generative model for Geant4 calorimeter shower emulation based on normalizing flows. Here, we present CaloFlow v2, an improvement on our original framework that speeds up shower generation by a further factor of 500 relative to the original. The improvement is based on a technique called Probability Density Distillation, originally developed for speech synthesis in the ML literature, and which we develop further by introducing a set of powerful new loss terms. …
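Probability Density Distillation trains a fast "student" flow to match a slow but accurate "teacher" flow by letting the teacher score the student's own samples. The toy sketch below illustrates only that core idea on a 2-D Gaussian; the model choices and names are illustrative assumptions and do not reproduce the CaloFlow v2 setup or its additional loss terms.

```python
# Toy sketch of probability density distillation (illustrative only, not the
# CaloFlow v2 code): a slow but accurate "teacher" density model supervises a
# fast "student" sampler by minimizing KL(student || teacher) over samples
# drawn from the student.
import torch

torch.manual_seed(0)

# Stand-in "teacher": a fixed 2-D Gaussian whose exact log-density plays the
# role of the expensive-to-sample teacher flow.
teacher = torch.distributions.MultivariateNormal(
    loc=torch.tensor([1.0, -2.0]),
    covariance_matrix=torch.tensor([[1.5, 0.3], [0.3, 0.5]]),
)

# Stand-in "student": a one-layer affine flow x = mu + exp(log_sigma) * z on a
# standard-normal base, playing the role of the fast student flow.
mu = torch.zeros(2, requires_grad=True)
log_sigma = torch.zeros(2, requires_grad=True)
base = torch.distributions.Normal(torch.zeros(2), torch.ones(2))

opt = torch.optim.Adam([mu, log_sigma], lr=0.05)

for step in range(2000):
    z = base.sample((512,))                      # sampling is a single cheap pass
    x = mu + log_sigma.exp() * z                 # reparameterized student samples
    # change of variables: log q(x) = log p_base(z) - sum_i log_sigma_i
    log_q = base.log_prob(z).sum(-1) - log_sigma.sum()
    log_p = teacher.log_prob(x)                  # teacher scores the student samples
    loss = (log_q - log_p).mean()                # Monte-Carlo estimate of the KL
    opt.zero_grad()
    loss.backward()
    opt.step()

print("learned mu:", mu.detach())
print("learned sigma:", log_sigma.exp().detach())
```

Because the student is sampled in a single forward pass, generation stays fast; the teacher is only queried for log-densities during training.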

Citations: cited by 22 publications (25 citation statements).
References: 22 publications (55 reference statements).
“…However, we take a more conservative perspective here. […] been shown to precisely learn complex distributions in particle physics [33-43]. The statistical benefits of using generative models are discussed in Ref.…”
Section: Online Training
Mentioning, confidence: 99%
“…Theory-driven ML generators at the parton level [17,68] can be combined with experiment-driven fast detector simulations [69-80] into single generative networks [67, 81-85], provided we have sufficient control over the network and its uncertainties. Single, soup-to-nuts simulation networks are inspired by the fundamental goal of the detection process, namely to reconstruct parton-level information as accurately as possible.…”
Section: Fast Generative Network
Mentioning, confidence: 99%
“…Inverse Autoregressive Flows (IAFs) [148] are fast in sampling, but a factor d slower in estimating the density of data points. Only recently have NFs (based on MAF and IAF architectures) been applied to calorimeter shower simulation [149,150], surpassing the quality of showers generated by an older GAN trained on the same dataset [122,123].…”
Section: Learning the Simulated Particle Interactions With Matter Thr...
Mentioning, confidence: 99%
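The trade-off quoted above comes from the direction in which the autoregressive conditioner is applied. The toy sketch below (an illustrative assumption, not code from the cited works) shows a masked autoregressive flow whose density evaluation takes one parallel pass while sampling needs D sequential passes; an IAF simply swaps which operation is cheap.

```python
# Toy illustration (not code from the cited works) of the MAF/IAF asymmetry:
# a masked autoregressive flow (MAF) evaluates densities in one parallel pass
# but needs D sequential passes to sample; an inverse autoregressive flow (IAF)
# is the mirror image, cheap to sample and D times slower to evaluate.
import torch

torch.manual_seed(0)
D = 4  # data dimensionality

# Hypothetical autoregressive conditioner: (shift_i, log_scale_i) depend only
# on x_{<i}.  A fixed, strictly lower-triangular linear map keeps the toy exact.
W = torch.tril(torch.randn(D, D), diagonal=-1)

def conditioner(x):
    h = x @ W.T
    return h, 0.1 * h          # (shift, log_scale), both autoregressive in x

def maf_log_prob(x):
    """Density of the MAF in a single parallel pass."""
    shift, log_scale = conditioner(x)
    u = (x - shift) * torch.exp(-log_scale)       # invert x -> u all at once
    base = torch.distributions.Normal(0.0, 1.0)
    return base.log_prob(u).sum(-1) - log_scale.sum(-1)

def maf_sample(n):
    """Sampling from the MAF needs D sequential conditioner calls."""
    u = torch.randn(n, D)
    x = torch.zeros(n, D)
    for i in range(D):                            # dimension i needs x_{<i} first
        shift, log_scale = conditioner(x)
        x[:, i] = shift[:, i] + torch.exp(log_scale[:, i]) * u[:, i]
    return x

x = maf_sample(3)
print(maf_log_prob(x))   # exact log-densities from one parallel pass
```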
“…The first generative models capable of generating samples that would confuse a classifier were based on normalizing flows [149,150]. This approach, called CaloFlow, used the same detector geometry as the CaloGAN.…”
Section: Hybrid Calorimeter For a Future Particle Collider
Mentioning, confidence: 99%