2002
DOI: 10.1088/1126-6708/2002/05/062

Neural network parametrization of deep-inelastic structure functions

Abstract: We construct a parametrization of deep-inelastic structure functions which retains information on experimental errors and correlations, and which does not introduce any theoretical bias while interpolating between existing data points. We generate a Monte Carlo sample of pseudo-data configurations and we train an ensemble of neural networks on them. This effectively provides us with a probability measure in the space of structure functions, within the whole kinematic region where data are available. This measu…
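The two steps described in the abstract, generating Monte Carlo pseudo-data replicas and training one network per replica, can be illustrated with a short sketch. The code below is a minimal, hypothetical rendering of that idea, not the paper's actual procedure: the kinematic points, central values, and errors are made up, the errors are taken as uncorrelated Gaussian, and scikit-learn's MLPRegressor with an arbitrary (4, 3) architecture stands in for the networks used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical measured points of F2(x, Q^2): kinematics (x, Q^2), central
# values, and uncertainties. Illustrative numbers only, not real data.
X = np.array([[0.01, 10.0], [0.05, 10.0], [0.1, 20.0], [0.3, 20.0]])
f2 = np.array([0.45, 0.40, 0.35, 0.20])
sigma = np.array([0.02, 0.02, 0.015, 0.01])

n_replicas = 100          # size of the Monte Carlo sample of pseudo-data
ensemble = []

for _ in range(n_replicas):
    # One pseudo-data replica: fluctuate each point within its error.
    # (The paper also propagates correlated systematics; here the errors
    # are treated as uncorrelated Gaussian for simplicity.)
    replica = f2 + sigma * rng.standard_normal(len(f2))

    # Train one neural network on this replica; the architecture is an
    # assumption made for this sketch.
    net = MLPRegressor(hidden_layer_sizes=(4, 3), max_iter=5000, random_state=0)
    net.fit(X, replica)
    ensemble.append(net)

# The trained ensemble plays the role of a probability measure on structure
# functions: its mean and spread at any (x, Q^2) give a central value and an
# uncertainty propagated from the data.
probe = np.array([[0.07, 15.0]])
preds = np.array([net.predict(probe)[0] for net in ensemble])
print(f"F2 estimate at probe point: {preds.mean():.3f} +- {preds.std():.3f}")
```

The spread of the replica networks' predictions at the probe point is what carries the experimental uncertainty into the space of structure functions.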


Citations: cited by 192 publications (339 citation statements)
References: 18 publications (16 reference statements)
“…[9]. The basic idea is to combine a Monte Carlo sampling of the probability measure on the space of functions that one is trying to determine (as in the approach of ref.…”
Section: JHEP03(2007)039
Mentioning confidence: 99%
“…[9,10] this strategy was tried out on a somewhat simpler problem, namely, the construction of a parametrization of existing data on the deep-inelastic structure function F2(x, Q^2) of the proton and neutron. In this case, one is only testing that the method can be used to construct a faithful representation of the probability density in a space of functions, based on the measurement of the function at a finite discrete number of points.…”
Section: JHEP03(2007)039
Mentioning confidence: 99%
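As the excerpt above notes, the ensemble is a representation of a probability density in a space of functions built from measurements at a finite number of points. A hypothetical continuation of the earlier sketch, reusing its `ensemble` list, shows how the value and uncertainty of any quantity derived from F2 (here a toy integral over x at fixed Q^2) follow from evaluating it replica by replica; the grid, Q^2 value, and integration range are arbitrary choices for illustration.

```python
import numpy as np

# Assumes `ensemble` from the previous sketch: a list of networks, each trained
# on one pseudo-data replica, i.e. one sample from the probability measure.
def integral_of_f2(net, q2=15.0):
    """Toy trapezoidal integral of the network's F2 prediction over x at fixed Q^2."""
    x_grid = np.linspace(0.01, 0.3, 50)
    pts = np.column_stack([x_grid, np.full_like(x_grid, q2)])
    f2_vals = net.predict(pts)
    return np.sum(0.5 * (f2_vals[1:] + f2_vals[:-1]) * np.diff(x_grid))

# Central value and uncertainty of the derived quantity come from the mean and
# spread of its value over the replica networks.
values = np.array([integral_of_f2(net) for net in ensemble])
print(f"toy integral of F2: {values.mean():.3f} +- {values.std():.3f}")
```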