2019
DOI: 10.48550/arxiv.1901.00875
Preprint
Event Generation and Statistical Sampling for Physics with Deep Generative Models and a Density Information Buffer

Abstract: We present a study of event generation from a physical process with deep generative models. The simulation of physical processes requires not only the production of physical events, but also ensuring that these events occur with the correct frequencies. We investigate the feasibility of learning both the event generation and the frequency of occurrence with Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs) to produce events like Monte Carlo generators. We study three processes: a simpl…

Cited by 26 publications (44 citation statements)
References 49 publications
“…We begin by showing a linear interpolation of QCD and W boson jets in Figures 12 and 14 and the difference between the two classes of images in Figure 13. Each column represents a different latent space value (1–12) in descending order.…”
Section: Exploring the Latent Space
confidence: 99%
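The interpolation described in the statement above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the cited work: the latent codes `z_qcd` and `z_w` are hypothetical stand-ins for the encoded QCD and W boson jets, and in practice each interpolated code would be passed through the trained decoder to produce one image column.

```python
import numpy as np

def latent_interpolation(z_a, z_b, steps=12):
    """Linearly interpolate between two latent vectors.

    Returns `steps` latent codes blending z_a into z_b; decoding each
    code yields one column of interpolated images.
    """
    alphas = np.linspace(0.0, 1.0, steps)
    return np.array([(1.0 - a) * z_a + a * z_b for a in alphas])

# Hypothetical 2-D latent codes for a QCD jet and a W boson jet.
z_qcd = np.array([0.0, 0.0])
z_w = np.array([1.0, 1.0])
path = latent_interpolation(z_qcd, z_w, steps=12)
```

The endpoints of `path` reproduce the two input codes exactly, so the first and last columns show the unmodified jets.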
“…The assumed prior can drag z into unrealistic regions of parameter space far from the location preferred by the likelihood, where the corresponding decoded image µ_d(z) does not look like a galaxy. A variety of methods have been proposed to address this problem, including constructing simple priors using q_φ(z) [83, 94], fitting q_φ(z) with a second VAE after training the primary one [93], or using normalizing flows [95–98]. Here we follow the simpler approach of creating a weakly-informative prior for z by fitting a multivariate normal distribution to the set of encoded means of the training data {µ_e(x^(i))}_{i=1}^N and rescaling its covariance matrix by a factor of 9.…”
Section: A Variational Autoencoders
confidence: 99%
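The "simpler approach" quoted above is a short computation: fit a Gaussian to the encoder means and inflate its covariance. A minimal NumPy sketch, assuming the encoded means are available as an (N, d) array (the toy `encoded` data below is a hypothetical stand-in for the real training-set encodings):

```python
import numpy as np

def fit_weakly_informative_prior(encoded_means, scale=9.0):
    """Fit a multivariate normal to the set of encoder means and
    rescale its covariance (by a factor of 9 in the quoted passage)
    so the resulting prior is weakly informative."""
    mu = encoded_means.mean(axis=0)
    cov = np.cov(encoded_means, rowvar=False) * scale
    return mu, cov

# Toy stand-in for the encoded means {µ_e(x^(i))}, N=500 points in d=4.
rng = np.random.default_rng(0)
encoded = rng.normal(size=(500, 4))
mu, cov = fit_weakly_informative_prior(encoded)
```

Rescaling the covariance widens the prior so it cannot pull latent codes far from the region the likelihood prefers, which is the failure mode the passage describes.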
“…For example, the usage of GANs to only perform detector/calorimeter simulation was explored in [8–21]. The use of GANs for simulating the underlying hard process was considered in [22–24], while the use of GANs for pileup description was explored in [25, 26]. Recent works have also explored the idea of replacing the entire reconstructed-event generation pipeline with a GAN [21, 22, 25, 27–30].…”
Section: Introduction
confidence: 99%
“…The use of GANs for simulating the underlying hard process was considered in [22–24], while the use of GANs for pileup description was explored in [25, 26]. Recent works have also explored the idea of replacing the entire reconstructed-event generation pipeline with a GAN [21, 22, 25, 27–30]. Each of these applications of GANs would differ in the nature of the training data used.…”
Section: Introduction
confidence: 99%