SciPost Submission Page

The NFLikelihood: an unsupervised DNNLikelihood from Normalizing Flows

by Humberto Reyes-Gonzalez, Riccardo Torre

Submission summary

Authors (as registered SciPost users): Humberto Reyes-González
Submission information
Preprint Link: https://arxiv.org/abs/2309.09743v3  (pdf)
Code repository: https://github.com/NF4HEP/NFLikelihoods
Data repository: https://zenodo.org/records/8349144
Date accepted: 2024-07-15
Date submitted: 2024-05-23 09:59
Submitted by: Reyes-González, Humberto
Submitted to: SciPost Physics
Ontological classification
Academic field: Physics
Specialties:
  • High-Energy Physics - Experiment
  • High-Energy Physics - Phenomenology
Approaches: Experimental, Computational, Phenomenological

Abstract

We propose the NFLikelihood, an unsupervised version, based on Normalizing Flows, of the DNNLikelihood proposed in Ref. [1]. We show, through realistic examples, how Autoregressive Flows, based on affine and rational quadratic spline bijectors, are able to learn complicated high-dimensional Likelihoods arising in High Energy Physics (HEP) analyses. We focus on a toy LHC analysis example already considered in the literature and on two Effective Field Theory fits of flavor and electroweak observables, whose samples have been obtained through the HEPFit code. We discuss advantages and disadvantages of the unsupervised approach with respect to the supervised one and consider possible interplays of the two.
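The core idea behind flow-based likelihood learning is the change-of-variables formula: a bijector maps data to a simple base distribution, and the log-density picks up the log-Jacobian of the map. As a minimal toy illustration of this principle (not the authors' implementation, which uses autoregressive flows with affine and spline bijectors), the sketch below fits a single affine bijector to 1D samples by maximum likelihood in plain NumPy; the data and learning rate are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "likelihood" samples: 1D Gaussian with unknown mean and scale.
x = rng.normal(loc=2.0, scale=0.5, size=10_000)

# Affine flow z = (x - mu) / sigma maps data to a standard-normal base.
# Change of variables: log p(x) = log N(z; 0, 1) - log sigma.
def neg_log_likelihood(mu, log_sigma, x):
    z = (x - mu) * np.exp(-log_sigma)
    return np.mean(0.5 * z**2 + 0.5 * np.log(2 * np.pi) + log_sigma)

# Fit mu and log_sigma by gradient descent on the NLL.
mu, log_sigma = 0.0, 0.0
lr = 0.1
for _ in range(500):
    z = (x - mu) * np.exp(-log_sigma)
    grad_mu = np.mean(-z * np.exp(-log_sigma))  # d NLL / d mu
    grad_ls = np.mean(1.0 - z**2)               # d NLL / d log_sigma
    mu -= lr * grad_mu
    log_sigma -= lr * grad_ls

print(mu, np.exp(log_sigma))  # close to the true 2.0 and 0.5
```

Training the flow by minimizing the negative log-likelihood of the samples is exactly the "unsupervised" step: no target density values are needed, only draws from the distribution being learned.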

Author indications on fulfilling journal expectations

  • Provide a novel and synergetic link between different research areas
  • Open a new pathway in an existing or a new research direction, with clear potential for multi-pronged follow-up work
  • Detail a groundbreaking theoretical/experimental/computational discovery
  • Present a breakthrough on a previously-identified and long-standing research stumbling block

List of changes

- Minor clarification about the Sliced Wasserstein Distance computation procedure.
- Minor clarification about further hyperparameter tuning.
- Fixed typo: Table 3 -> Table 6.

Published as SciPost Phys. Core 7, 048 (2024)


Reports on this Submission

Report #1 by Anonymous (Referee 3) on 2024-6-7 (Invited Report)

  • Cite as: Anonymous, Report on arXiv:2309.09743v3, delivered 2024-06-07, doi: 10.21468/SciPost.Report.9205

Strengths

1- Application of well-studied machine learning (ML) strategy to the novel problem of likelihood estimation (strictly speaking, posterior estimation)

2- Case studies on two realistic examples at the Large Hadron Collider (LHC).

Weaknesses

1- No real innovation on the ML side, just a direct application of normalizing flows.

2- The ultimate version of this method is likely to use conditional normalizing flows, an approach not explored in this paper.

Report

I was asked by the Editor to evaluate this manuscript in the context of suitability for SciPost Physics versus SciPost Physics Core.

First, let me concur with the previous referee that this paper presents a promising new approach towards an ML implementation of likelihoods relevant for the LHC. In previous work by one of the authors on DNNLikelihood, it was shown that one can use ML regression strategies to interpolate the likelihood function from a set of sampled points taken from the true likelihood. Here, the authors introduce NFLikelihood, which uses normalizing flows to learn the likelihood function (strictly speaking the posterior for a given prior). The authors discuss the relative strengths of the two approaches, where DNNLikelihood is typically better suited for frequentist analyses where the peaks of the likelihood are most important to model, while NFLikelihood is typically better suited for Bayesian analyses, where it is important to model the entire posterior. In addition to using a toy LHC analysis to benchmark against DNNLikelihood, the authors apply NFLikelihood to two realistic studies related to electroweak and flavor physics. These are high dimensional problems (the smallest example is 40 parameters), so the strong performance of NFLikelihood is very promising, especially given its reduced computational costs.

At some level, all this paper is doing is using normalizing flows for density estimation, which is a well-studied problem. That said, this paper is performing density estimation in a context of high relevance to the LHC, where archiving likelihoods in a computationally efficient form is an important part of making sure LHC analyses can be reused in the context of global fits. If this paper only repeated the toy LHC analysis from the DNNLikelihood paper, I would probably recommend publication in SciPost Physics Core. But since the authors both introduce a novel machine learning strategy and apply it to two realistic problems of interest to the LHC community, I think this paper meets the standards of SciPost Physics.
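The autoregressive structure the paper relies on can be sketched minimally: each coordinate is shifted and scaled conditioned only on the preceding coordinates, so the Jacobian is triangular and its log-determinant is just the sum of the log-scales. The 2D example below is purely illustrative, with hand-picked linear conditioner parameters that are not taken from the paper's code.

```python
import numpy as np

# Affine autoregressive flow in 2D. Conditioner parameters are
# hand-picked for illustration (hypothetical, not learned).

def forward(x):
    # z1 depends only on x1; z2 depends on (x1, x2).
    mu1, log_s1 = 0.5, 0.2
    z1 = (x[0] - mu1) * np.exp(-log_s1)
    mu2, log_s2 = 0.3 * x[0], 0.1 * x[0]   # linear conditioner on x1
    z2 = (x[1] - mu2) * np.exp(-log_s2)
    log_det = -(log_s1 + log_s2)           # log|det dz/dx|, triangular Jacobian
    return np.array([z1, z2]), log_det

def inverse(z):
    # Invert sequentially: x1 from z1, then x2 using the recovered x1.
    mu1, log_s1 = 0.5, 0.2
    x1 = z[0] * np.exp(log_s1) + mu1
    mu2, log_s2 = 0.3 * x1, 0.1 * x1
    x2 = z[1] * np.exp(log_s2) + mu2
    return np.array([x1, x2])

def log_density(x):
    # Change of variables against a standard-normal base distribution.
    z, log_det = forward(x)
    base = -0.5 * np.sum(z**2) - np.log(2 * np.pi)
    return base + log_det

x = np.array([1.2, -0.7])
z, _ = forward(x)
assert np.allclose(inverse(z), x)  # exact invertibility
```

In a trained flow the conditioners are neural networks (and the affine maps may be replaced by rational quadratic splines), but the triangular-Jacobian bookkeeping is the same, which is what keeps density evaluation cheap in high dimensions.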

Requested changes

None

Recommendation

Publish (meets expectations and criteria for this Journal)

  • validity: top
  • significance: high
  • originality: good
  • clarity: high
  • formatting: good
  • grammar: excellent
