
SciPost Submission Page

Reduce, Reuse, Reinterpret: an end-to-end pipeline for recycling particle physics results

by Giordon Stark, Camila Aristimuno Ots, Michael Hance

This is not the latest submitted version.

Submission summary

Authors (as registered SciPost users): Michael Hance · Giordon Stark
Submission information
Preprint Link: https://arxiv.org/abs/2306.11055v1  (pdf)
Code repository: https://github.com/scipp-atlas/mapyde
Data repository: https://github.com/scipp-atlas/mapyde-paper-data
Date submitted: 2023-06-21 15:41
Submitted by: Hance, Michael
Submitted to: SciPost Physics
Ontological classification
Academic field: Physics
Specialties:
  • High-Energy Physics - Experiment
  • High-Energy Physics - Phenomenology
Approaches: Experimental, Phenomenological

Abstract

Searches for new physics at the Large Hadron Collider have constrained many models of physics beyond the Standard Model. Many searches also provide resources that allow them to be reinterpreted in the context of other models. We describe a reinterpretation pipeline that examines previously untested models of new physics using supplementary information from ATLAS Supersymmetry (SUSY) searches in a way that provides accurate constraints even for models that differ meaningfully from the benchmark models of the original analysis. The public analysis information, such as analysis routines and serialized probability models, is combined with the common event generation and simulation toolkits MadGraph, Pythia8, and Delphes into workflows steered by TOML configuration files, and bundled into the mapyde Python package. The use of mapyde is demonstrated by constraining previously untested SUSY models with compressed sleptons and electroweakinos using ATLAS results.
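To give a concrete feel for the workflow described in the abstract, the sketch below shows how a TOML-steered configuration might chain the generation, simulation, analysis, and statistical steps into one run. The section and key names here are hypothetical placeholders chosen for illustration, not the actual mapyde schema; the package documentation and repository define the real configuration options.

    # Hypothetical sketch of a TOML-steered reinterpretation workflow.
    # Section and key names are illustrative only; see the mapyde
    # documentation for the real configuration schema.

    [base]
    output = "compressed_sleptons_run"     # where results are collected

    [madgraph]
    process = "p p > sl- sl+"              # example SUSY slepton production
    params  = "slha/benchmark_point.slha"  # SLHA spectrum for the model point

    [pythia8]
    tune = "default"                       # showering and hadronization settings

    [delphes]
    card = "delphes_card_ATLAS.tcl"        # parameterized detector simulation

    [analysis]
    routine = "public_ATLAS_selection"     # public analysis routine (e.g. SimpleAnalysis)

    [likelihood]
    workspace = "hepdata/bkgonly.json"     # serialized probability model from HEPData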

Current status:
Has been resubmitted

Reports on this Submission

Report #3 by Anonymous (Referee 3) on 2023-10-30 (Invited Report)

  • Cite as: Anonymous, Report on arXiv:2306.11055v1, delivered 2023-10-30, doi: 10.21468/SciPost.Report.8021

Report

The authors describe in this article the so-called mapyde tool chain, which aims at facilitating the reinterpretation of existing analyses by the LHC collaborations. This looks like a rather interesting and easy-to-use tool which in principle warrants publication. However, one aspect is not entirely clear: the authors present their tool for a single analysis only, but what makes simplified-model analyses really interesting is that they can in principle be combined to investigate more realistic models. Can this tool be used to combine different analyses? If so, the authors should briefly discuss how it can be used to obtain constraints on a model from the combination of different analyses. This could be done in an additional appendix.

Moreover, there are a few minor points that should be clarified.
1- page 4, 1st paragraph of section 3.2: It should be mentioned that the NLO corrections used by the ATLAS collaboration are QCD corrections but do not include electroweak corrections.

2- page 4, 2nd paragraph of section 3.2: the authors mention that they need to adjust the lepton efficiencies in DELPHES to reproduce the ATLAS signal yields. They state that the values used correspond to the upper range of their Ref. [41]. It would be helpful if these numbers were collected in a short appendix so that the paper is self-contained.

3- page 4, 3rd paragraph of section 3.2: the authors state that they consider a scenario in which a slepton decays into a wino-like neutralino, assuming that the corresponding branching ratio is 100 per cent. This assumption is unrealistic, as in such a case one would also expect a wino-like chargino with a mass similar to that of the neutralino, which would also contribute to their signal, albeit with different kinematics. While the assumption is fine for the purpose of the demonstration that the authors have in mind, they should mention, either in the text or in a footnote, that it is not realistic, as outlined in this comment.

4- page 9, Fig. 8: from the figure one could get the impression that the authors intend to study decays with an on-shell W-boson, while according to the discussion in the text it has to be an off-shell W-boson. I propose that the authors replace `W-boson' in the caption by `(off-shell) W-boson' to avoid any confusion.

5- figure 9 and the corresponding discussion: I do not fully understand what is shown here, nor the caption.
a) The authors also consider scenarios where both charginos are present, as indicated in Fig. 8. In this case there will be four neutralinos present, not only three as claimed in the caption.
b) What is the precise meaning of `fraction of models excluded'? What do we learn from this number, given that each bin contains only a few points and it is not clear how representative these points are?
c) Why do the authors give only the line for the higgsino case and not also the one for the bino/wino case of their Ref. [41]? In their scan both cases are possible and, thus, both lines should be shown.

Requested changes

1- please add a discussion on how to combine different analyses to constrain a model

2- please add the lepton efficiencies used for DELPHES; see comment 2 above

3- figure 9: please add the line for the exclusion of the wino-bino scenario from Ref. [41]

  • validity: -
  • significance: -
  • originality: -
  • clarity: -
  • formatting: -
  • grammar: -

Report #2 by Anonymous (Referee 2) on 2023-10-27 (Invited Report)

  • Cite as: Anonymous, Report on arXiv:2306.11055v1, delivered 2023-10-26, doi: 10.21468/SciPost.Report.8003

Report

The paper "Reduce, Reuse, Reinterpret: an End-to-End Pipeline for Recycling Particle Physics Results" by G. Stark, C. Ots and M. Hance presents the MaPyDe toolkit for reinterpretation of analyses based on the SimpleAnalysis framework for analysis description. The work is interesting and the tool is clearly a useful addition to the already existing reinterpretation tools. Physics applications of the toolkit to a few BSM scenarios are also briefly discussed.
The paper, however, does not present enough groundbreaking physics results.
In particular, similar scenarios have been investigated in other pMSSM studies and the single ATLAS analysis considered in the work is also available in other reinterpretation tools (such as CheckMATE and SModelS).
Therefore I strongly recommend that it be submitted to SciPost Physics Codebases with some minor modifications.
Below I list some of the changes which I think would improve its readability and usefulness for MaPyDe users.

Requested changes

In order to make the draft more suitable for SciPost Physics Codebases, I recommend the following changes:
1. I suggest moving the information presented in Appendix B to the main body of the text, since it is essential information for running the toolkit.
2. It would also be desirable to include a short section with installation instructions. Although these can be found on the project page, they could be included for completeness.
3. Finally, more details about the running options should be provided.

In addition, the following points should be addressed:
a. The URL for the repository containing the paper data (Ref. [42]) is wrong: "-atlas" is missing.
b. In Figure 1: the SLHA input is shown as coming from the "Analysis Paper+HEPData" box. Why is it not part of the basic MadGraph input (which usually requires a process, parameter and run cards)?
c. For completeness, Figure 4 should include the value of the slepton mass used as well as the value of $\Delta m(\tilde l,\tilde \chi_1^0)$.
d. The link to the containers (https://github.com/scipp-atlas/container_registry) seems to be broken.
e. In both Figures 2 and 7 it would be desirable to show the ratio between the cross-section upper limits obtained from MaPyDe and the official ATLAS ones across the mass plane. This would provide a more robust validation over a wide range of masses, since it goes beyond the small region of parameter space around the exclusion curve.
f. "Listing A" is mentioned at the top of page 10, but it should probably read "Listing 1".
g. It is very briefly mentioned that dark matter and the muon $g-2$ constraints are imposed. But more details should be provided. In particular, which range of $g-2$ values was considered? Also, which dark matter constraints were imposed?

  • validity: high
  • significance: good
  • originality: good
  • clarity: high
  • formatting: excellent
  • grammar: perfect

Report #1 by Anonymous (Referee 1) on 2023-10-9 (Contributed Report)

  • Cite as: Anonymous, Report on arXiv:2306.11055v1, delivered 2023-10-09, doi: 10.21468/SciPost.Report.7922

Strengths

1-Clear and well written
2-Example use case clearly illustrates the package
3-Code and documentation included
4-Free of grammatical errors

Weaknesses

1-Not really suitable for SciPost Physics

Report

This manuscript introduces a Python package, "mapyde", that combines existing toolkits into a workflow for 'reusing' past analyses on new models. Package usage is demonstrated through a SUSY example using previous ATLAS results.

The source code is included in a separate GitHub repository. The paper is very well written and easy to follow. There are no grammatical errors. The example included clearly demonstrates the power of the proposed pipeline. I do not have any comments here.

Since the included analysis is not new, and the individual packages have been used extensively in the literature, I think this addition would be perfect for publication under SciPost Physics Codebases rather than SciPost Physics.

Requested changes

1-"... In either case, ... are only strictly .... models only ...": delete redundant "only"
2-"python" and "github" should be "Python" and "GitHub"

  • validity: top
  • significance: high
  • originality: high
  • clarity: top
  • formatting: perfect
  • grammar: perfect
