2020
DOI: 10.1007/jhep06(2020)114
Using neural networks for efficient evaluation of high multiplicity scattering amplitudes

Abstract: Precision theoretical predictions for high multiplicity scattering rely on the evaluation of increasingly complicated scattering amplitudes which come with an extremely high CPU cost. For state-of-the-art processes this can cause technical bottlenecks in the production of fully differential distributions. In this article we explore the possibility of using neural networks to approximate multi-variable scattering amplitudes and provide efficient inputs for Monte Carlo integration. We focus on QCD corrections to…
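As a rough illustration of the idea described in the abstract, the following is a minimal sketch (not the authors' code) of training a small neural-network surrogate for an expensive multi-variable function and then using it as a cheap integrand in flat Monte Carlo integration. The toy_amplitude function, the number of phase-space variables, and all network and training settings are illustrative assumptions.

```python
# Minimal sketch: neural-network surrogate for an "amplitude" used in
# flat Monte Carlo integration. All settings here are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
DIM = 5  # number of phase-space variables (toy assumption)

def toy_amplitude(x):
    # Stand-in for an expensive matrix-element call; mildly peaked.
    return 1.0 / ((x.sum(dim=1) - 2.0) ** 2 + 0.1)

# Training data: random phase-space points and exact integrand values.
x_train = torch.rand(50_000, DIM)
y_train = toy_amplitude(x_train).unsqueeze(1)

surrogate = nn.Sequential(
    nn.Linear(DIM, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(surrogate(x_train), y_train)
    loss.backward()
    opt.step()

# Use the trained surrogate as a fast integrand over the unit hypercube.
with torch.no_grad():
    x_mc = torch.rand(1_000_000, DIM)
    integral_nn = surrogate(x_mc).mean().item()
    integral_exact = toy_amplitude(x_mc).mean().item()
print(f"flat MC with surrogate: {integral_nn:.4f}  exact integrand: {integral_exact:.4f}")
```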

Cited by 42 publications (80 citation statements)
References 42 publications
“…However, our quantitative analysis of Bayesian top taggers encountered practical limitations, for instance that the jet energy scale simultaneously affects the central value and the error bar of the probabilistic output. A similar study of uncertainties just appeared for a matrix element regression task [46].…”
Section: Introduction (mentioning)
confidence: 73%
“…Technically, we propose to use invertible networks (INNs) [11][12][13] to invert part of the LHC simulation chain. This application builds on a long list of one-directional applications of generative or similar networks to LHC simulations, including phase space integration [14,15], amplitudes [16,17], event generation [18][19][20][21][22], event subtraction [23], detector simulations [24][25][26][27][28][29][30][31][32], parton showers [33][34][35][36], or searches for physics beyond the Standard Model [37]. INNs are an alternative class of generative networks, based on normalizing flows [38][39][40][41].…”
Section: Introduction (mentioning)
confidence: 99%
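The citation statement above refers to invertible networks built on normalizing flows. Purely as background, here is a minimal sketch of the kind of building block such networks use: an affine coupling layer, which is invertible by construction. This is a generic illustration, not the specific architecture of the cited works; layer sizes and the input split are arbitrary assumptions.

```python
# Minimal sketch of an affine coupling layer (RealNVP-style), the
# generic building block of invertible networks / normalizing flows.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        # Sub-network predicts scale and shift for the second half of the
        # input from the first half; it never needs to be inverted itself.
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        y2 = x2 * torch.exp(s) + t          # invertible by construction
        return torch.cat([x1, y2], dim=1)

    def inverse(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s, t = self.net(y1).chunk(2, dim=1)
        x2 = (y2 - t) * torch.exp(-s)       # exact analytic inverse
        return torch.cat([y1, x2], dim=1)

layer = AffineCoupling(dim=4)
x = torch.randn(3, 4)
print(torch.allclose(layer.inverse(layer(x)), x, atol=1e-5))  # True
```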
“…These approaches, however, require learning the full phase space density, instead of just the likelihood ratio, which is significantly more complicated than the neural resampling approach presented here. It is also worth mentioning that neural networks and other machine learning techniques have been studied to improve other components of event generation, including parton density modeling [65,66], phase space generation [67][68][69][70][71], matrix element calculations [72,73], and more [74][75][76].…”
Section: Introduction (mentioning)
confidence: 99%
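The last citation statement contrasts learning a full phase-space density with learning only a likelihood ratio. As a generic illustration of the likelihood-ratio idea (not the setup of the cited paper), a classifier trained to separate two event samples yields per-event weights w(x) = p_target(x)/p_source(x) ≈ D(x)/(1 − D(x)), without modelling the full density. The Gaussian toy samples and network settings below are assumptions.

```python
# Minimal sketch: classifier-based density-ratio (likelihood-ratio) weights.
import torch
import torch.nn as nn

torch.manual_seed(0)
n = 20_000
source = torch.randn(n, 1)              # "generator" sample (toy)
target = torch.randn(n, 1) * 1.2 + 0.5  # "data-like" sample (toy)

x = torch.cat([source, target])
labels = torch.cat([torch.zeros(n, 1), torch.ones(n, 1)])

clf = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for _ in range(500):
    opt.zero_grad()
    loss = loss_fn(clf(x), labels)
    loss.backward()
    opt.step()

with torch.no_grad():
    d = torch.sigmoid(clf(source))
    weights = d / (1.0 - d)             # approximate density ratio per event
print("mean reweighting factor:", weights.mean().item())
```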