
Normalizing flows for high-dimensional detector simulations

Florian Ernst, Luigi Favaro, Claudius Krause, Tilman Plehn, David Shih

SciPost Phys. 18, 081 (2025) · published 5 March 2025

Abstract

Whenever invertible generative networks are needed for LHC physics, normalizing flows show excellent performance. In this work, we investigate their performance for fast calorimeter shower simulations with increasing phase space dimension. We use fast and expressive coupling spline transformations applied to the CaloChallenge datasets. In addition to the base flow architecture, we also employ a VAE to compress the dimensionality and train a generative network in the latent space. We evaluate our networks on several metrics, including high-level features, classifiers, and generation time. Our findings demonstrate that invertible neural networks have competitive performance when compared to autoregressive flows, while being substantially faster during generation.
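To illustrate the coupling-flow idea behind the architecture described above, the sketch below builds a small coupling-based normalizing flow in PyTorch and trains it by maximum likelihood. Note the simplifications: the paper uses rational-quadratic spline coupling transformations on CaloChallenge showers, while this illustration uses plain affine couplings with the same alternating-mask structure; all layer sizes, the feature dimension, and the placeholder data are assumptions for the example, not the authors' implementation.

```python
# Minimal coupling-flow sketch (assumed, illustrative): affine couplings with
# alternating binary masks, a standard-normal base distribution, and a
# maximum-likelihood training step. Not the paper's spline-coupling code.
import torch
import torch.nn as nn


class AffineCoupling(nn.Module):
    """Transforms the unmasked features conditioned on the masked ones."""

    def __init__(self, dim: int, mask: torch.Tensor, hidden: int = 128):
        super().__init__()
        self.register_buffer("mask", mask)
        # Small conditioner network predicting per-feature scale and shift.
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * dim),
        )

    def forward(self, x):
        x_masked = x * self.mask
        s, t = self.net(x_masked).chunk(2, dim=-1)
        s = torch.tanh(s) * (1 - self.mask)   # only transform unmasked features
        t = t * (1 - self.mask)
        y = x_masked + (1 - self.mask) * (x * torch.exp(s) + t)
        log_det = s.sum(dim=-1)               # log|det J| of the affine map
        return y, log_det

    def inverse(self, y):
        # Masked features pass through unchanged, so the conditioner sees the
        # same inputs as in the forward pass and the map can be inverted exactly.
        y_masked = y * self.mask
        s, t = self.net(y_masked).chunk(2, dim=-1)
        s = torch.tanh(s) * (1 - self.mask)
        t = t * (1 - self.mask)
        return y_masked + (1 - self.mask) * ((y - t) * torch.exp(-s))


class CouplingFlow(nn.Module):
    """Stack of coupling layers with alternating masks and a Gaussian base."""

    def __init__(self, dim: int, n_layers: int = 8):
        super().__init__()
        masks = [(torch.arange(dim) % 2 == (i % 2)).float() for i in range(n_layers)]
        self.layers = nn.ModuleList([AffineCoupling(dim, m) for m in masks])
        self.base = torch.distributions.Normal(0.0, 1.0)

    def log_prob(self, x):
        log_det = torch.zeros(x.shape[0], device=x.device)
        for layer in self.layers:
            x, ld = layer(x)
            log_det = log_det + ld
        return self.base.log_prob(x).sum(dim=-1) + log_det

    @torch.no_grad()
    def sample(self, n: int, dim: int):
        # Single forward pass through the inverse layers: this one-shot sampling
        # is what makes coupling flows fast at generation time.
        z = self.base.sample((n, dim))
        for layer in reversed(self.layers):
            z = layer.inverse(z)
        return z


# One training step: maximize the likelihood of (preprocessed) shower features.
DIM = 368                                   # assumed feature count for illustration
flow = CouplingFlow(dim=DIM)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
showers = torch.randn(512, DIM)             # placeholder for real, preprocessed showers
loss = -flow.log_prob(showers).mean()
loss.backward()
opt.step()
```

For the higher-dimensional datasets, the same generative model could be trained on the latent codes of a separately trained VAE instead of on the voxels directly, which is the latent-space variant mentioned in the abstract; the flow code itself would be unchanged apart from the input dimension.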


