2023
DOI: 10.21468/scipostphys.15.4.141

MadNIS - Neural multi-channel importance sampling

Theo Heimel,
Ramon Winterhalder,
Anja Butter
et al.

Abstract: Theory predictions for the LHC require precise numerical phase-space integration and the generation of unweighted events. We combine machine-learned multi-channel weights with a normalizing flow for importance sampling to improve classical methods for numerical integration. We develop an efficient bi-directional setup based on an invertible network, combining online and buffered training for potentially expensive integrands. We illustrate our method for the Drell-Yan process with an additional narrow resonance.
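The core idea of multi-channel importance sampling can be illustrated with a minimal sketch: several sampling densities ("channels"), each adapted to one structure of the integrand, are mixed with channel weights, and samples are reweighted by the ratio of integrand to mixture density. The integrand, channels, and fixed weights below are toy choices for illustration, not the MadNIS setup, where the channel weights are learned by a network.

```python
# Minimal sketch of multi-channel importance sampling with two
# hand-picked Gaussian channels (hypothetical toy example).
import numpy as np

rng = np.random.default_rng(0)

def integrand(x):
    # Toy integrand with two peaks, loosely mimicking resonance structures.
    return np.exp(-0.5 * ((x - 1.0) / 0.1) ** 2) + np.exp(-0.5 * ((x + 1.0) / 0.3) ** 2)

# Two Gaussian channels, each adapted to one peak.
means = np.array([1.0, -1.0])
sigmas = np.array([0.1, 0.3])
alphas = np.array([0.5, 0.5])  # channel weights; learned in MadNIS, fixed here

def mixture_density(x):
    # q(x) = sum_i alpha_i g_i(x), with g_i a normalized Gaussian
    g = np.exp(-0.5 * ((x[:, None] - means) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    return g @ alphas

n = 100_000
channel = rng.choice(2, size=n, p=alphas)         # pick a channel per event
x = rng.normal(means[channel], sigmas[channel])   # sample from that channel
weights = integrand(x) / mixture_density(x)       # importance weights
estimate = weights.mean()
error = weights.std(ddof=1) / np.sqrt(n)
```

The closer each channel density tracks "its" peak, the flatter the weights and the smaller the variance; optimizing the channel weights and the mappings jointly is exactly what the paper automates.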

Cited by 10 publications (10 citation statements). References 53 publications.
“…The third network we analyze using learned classifier weights generates events for the process pp → (Z → µ + µ − ) + 1, 2, 3 jets, (7) at the reconstruction level, using the precision INN architecture described in detail in Ref. [18].…”
Section: Event Generation
confidence: 99%
“…The range of tasks for generative networks in LHC simulations and analysis is extensive. Given the modular structure of LHC simulations, it starts with phase space integration and sampling [2][3][4][5][6][7], for instance of ML-encoded transition amplitudes. More LHC-specific tasks include event subtraction [8], event unweighting [9,10], or super-resolution enhancement [11,12].…”
Section: Introduction
confidence: 99%
“…With the development of modern machine learning methods, new techniques for adaptive Monte-Carlo integration have emerged, which are based on the extension [42,43] of a nonlinear independent components estimation technique [44,45], also known as a normalizing flow. They have been used to develop integration algorithms based on existing multi-channel approaches [19,21,22,25]. One of the main obstacles to scaling such approaches to high multiplicity has been the fact that the underlying phase-space mappings are used as individual mappings in a multi-channel phase-space generator.…”
Section: Combination With Normalizing-flow Based Integrators
confidence: 99%
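The mechanism that the citation above describes — an invertible map with a tractable Jacobian used as an importance-sampling density — can be sketched without any neural network. Below, a fixed analytic mapping (the classic Breit-Wigner tangent mapping) stands in for a trained normalizing flow; the integrand and parameters are hypothetical toy choices, chosen so the mapping is a near-perfect match and the weights become almost constant.

```python
# Sketch of importance sampling via an invertible map x = T(z) with
# Jacobian |dx/dz|; a trained flow would learn such a mapping.
import numpy as np

rng = np.random.default_rng(1)

M, W = 0.5, 0.02  # toy "resonance" position and width

def integrand(x):
    # Narrow Cauchy-like peak on [0, 1].
    return W / ((x - M) ** 2 + W ** 2)

def flow(z):
    # Invertible map [0,1] -> [0,1] concentrating samples near the peak.
    a = np.arctan(-M / W)
    b = np.arctan((1.0 - M) / W)
    t = np.tan(a + (b - a) * z)
    x = M + W * t
    jac = (b - a) * W * (1.0 + t ** 2)  # |dx/dz|
    return x, jac

z = rng.uniform(size=100_000)     # latent samples from a flat base density
x, jac = flow(z)
weights = integrand(x) * jac      # f(T(z)) * |dx/dz|
estimate = weights.mean()
```

Because the mapping here exactly absorbs the peak, the weights are constant and the variance vanishes; a learned flow approximates this ideal for integrands with no known analytic mapping.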
“…Adaptive Monte-Carlo methods [12][13][14][15][16][17] are therefore used by most theoretical calculations and event generators to map out structures of the integrand which are difficult to predict. More recently, neural networks have emerged as a promising tool for this particular task [18][19][20][21][22][23][24][25].…”
Section: Introduction
confidence: 99%