Following the growing success of generative neural networks in LHC simulations, the crucial question is how to control these networks and assign uncertainties to their event output. We show how Bayesian normalizing flows or invertible networks capture uncertainties from the training and turn them into an uncertainty on the event weight. Fundamentally, the interplay between density and uncertainty estimates indicates that these networks learn functions in analogy to parameter fits rather than to binned event counts.
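The core idea can be illustrated with a minimal sketch, not the paper's actual setup: a Bayesian density estimator is represented by samples from a posterior over its parameters, and evaluating the learned density under every posterior sample gives, for each event, a central density value and a spread that plays the role of the training-induced uncertainty on the event weight. Here a 1D Gaussian with hypothetical posterior samples over its mean and width stands in for a Bayesian normalizing flow; all numbers are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior over the parameters (mu, sigma) of a 1D Gaussian
# density model, standing in for the weight posterior of a Bayesian flow.
n_posterior = 200
mu_samples = rng.normal(0.0, 0.05, n_posterior)     # posterior over the mean
sigma_samples = rng.normal(1.0, 0.05, n_posterior)  # posterior over the width


def density(x, mu, sigma):
    """Gaussian density, a stand-in for the network's learned p(x)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)


events = rng.normal(0.0, 1.0, size=5)  # a few toy "events"

# Density of each event under every posterior sample: shape (n_events, n_posterior).
p = density(events[:, None], mu_samples[None, :], sigma_samples[None, :])

p_mean = p.mean(axis=1)  # central density estimate per event
p_std = p.std(axis=1)    # spread over the posterior: per-event uncertainty

for x, m, s in zip(events, p_mean, p_std):
    print(f"x = {x:+.2f}  p(x) = {m:.3f} +- {s:.3f}")
```

The per-event spread `p_std` is what becomes an uncertainty on the event weight when the density estimate enters a reweighting; in a real Bayesian flow the posterior samples would be draws over the network weights rather than over two Gaussian parameters.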