2022 · Preprint
DOI: 10.48550/arxiv.2202.07352

Calomplification -- The Power of Generative Calorimeter Models

Sebastian Bieringer,
Anja Butter,
Sascha Diefenbacher
et al.

Abstract: Motivated by the high computational costs of classical simulations, machine-learned generative models can be extremely useful in particle physics and elsewhere. They become especially attractive when surrogate models can efficiently learn the underlying distribution, such that a generated sample outperforms a training sample of limited size. This kind of GANplification has been observed for simple Gaussian models. We show the same effect for a physics simulation, specifically photon showers in an electromagnet…
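The amplification effect described in the abstract can be illustrated with a small toy study. The sketch below is an assumption of this note, not the paper's setup: a parametric Gaussian fit stands in for the trained generative network, and the quality measure is the mean squared error of sample quantiles against the true distribution. Because the fit smooths out statistical fluctuations, a large generated sample typically lands closer to the truth than the limited training sample it was fitted on.

```python
import random
from statistics import NormalDist, mean, stdev

rng = random.Random(42)

# True underlying distribution: standard normal
true_dist = NormalDist(0.0, 1.0)
qs = [0.05 * i for i in range(1, 20)]            # probability levels 0.05 ... 0.95
true_q = [true_dist.inv_cdf(q) for q in qs]      # true quantile values

def quantile_mse(sample):
    """Mean squared error of nearest-rank sample quantiles vs. the truth."""
    s = sorted(sample)
    n = len(s)
    est = [s[min(n - 1, int(q * n))] for q in qs]
    return mean((e - t) ** 2 for e, t in zip(est, true_q))

n_train, n_gen, n_trials = 100, 10_000, 200
err_train = err_gen = 0.0
for _ in range(n_trials):
    # Statistics-limited "training" sample
    train = [rng.gauss(0.0, 1.0) for _ in range(n_train)]
    # Surrogate model: Gaussian fitted to the training sample
    mu, sigma = mean(train), stdev(train)
    # Large "generated" sample drawn from the fitted surrogate
    gen = [rng.gauss(mu, sigma) for _ in range(n_gen)]
    err_train += quantile_mse(train)
    err_gen += quantile_mse(gen)

amplified = err_gen < err_train
print("generated sample closer to truth on average:", amplified)
```

Averaging over many trials matters here: in any single trial the training sample can happen to beat the generated one, but on average the smooth fit wins, which is the intuition behind the amplification factors quantified in the paper for GAN surrogates.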

Cited by 2 publications (3 citation statements)
References 50 publications
“…In addition to driving progress in treating phase-space generation through deep networks, NN-generators have the structural advantage that they can be trained on any combination of simulated and/or measured events. They also provide an efficient way to define and ship standardized event samples, including a modest numerical advantage over a statistics-limited training dataset [467,468], and can be used for post-processing of standard simulations. In these functions, it is again crucial to understand and control the precision of these generative networks at the level needed for precision measurements [469,470].…”
Section: Machine Learning Techniques
confidence: 99%
“…Given the interpolation properties of neural networks and the benefits of their implicit bias in the applications described in Sec. 2, we can quantify the amplification of statistics-limited training data through generative networks [64,65].…”
Section: End-to-end ML-generators
confidence: 99%
“…Neural networks work much like a fit and not like an interpolation, in the sense that they do not reproduce the training data faithfully but instead learn a smooth approximation [64,65]. This is where we can gain some intuition for a NN-uncertainty treatment.…”
Section: Control and Precision
confidence: 99%