SciPost Submission Page
Casting a graph net to catch dark showers
by Elias Bernreuther, Thorben Finke, Felix Kahlhoefer, Michael Krämer, Alexander Mück
As Contributors: Elias Bernreuther · Thorben Finke · Michael Krämer
Arxiv Link: https://arxiv.org/abs/2006.08639v1 (pdf)
Date submitted: 2020-06-23 02:00
Submitted by: Bernreuther, Elias
Submitted to: SciPost Physics
Subject area: High-Energy Physics - Phenomenology
Strongly interacting dark sectors predict novel LHC signatures such as semi-visible jets resulting from dark showers that contain both stable and unstable dark mesons. Distinguishing such semi-visible jets from large QCD backgrounds is difficult and constitutes an exciting challenge for jet classification. In this article we explore the potential of supervised deep neural networks to identify semi-visible jets. We show that dynamic graph convolutional neural networks operating on so-called particle clouds outperform convolutional neural networks analysing jet images as well as other neural networks based on Lorentz vectors. We investigate how the performance depends on the properties of the dark shower and discuss training on mixed samples as a strategy to reduce model dependence. By modifying an existing mono-jet analysis we show that LHC sensitivity to dark sectors can be enhanced by more than an order of magnitude by using the dynamic graph network as a dark shower tagger.
Submission & Refereeing History
Reports on this Submission
Anonymous Report 3 on 2020-07-21 (Invited Report)
This paper demonstrates the application of graph networks to the jets of dark showers at the LHC. The work is timely, the analysis is thorough, and the paper is well-written. I congratulate the authors on this nice addition to the literature! Below are a small number of minor comments that it would be good to address before the paper is published.
- In the [36,48,49] citation block on p5, it might be best to add 1902.09914.
- p5: "dense neural network" is Tensorflow jargon - perhaps "fully connected network" would be more appropriate.
- Discussion on p6, before Sec. 3.1: I'm not sure how this is different from a CNN; a CNN operator, just like the EdgeConv operator, is local, but stacking many layers allows for global relations. Please either tell me why this is wrong or modify the text accordingly.
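To make the locality point concrete, here is a minimal numpy sketch of an EdgeConv-style update (not the authors' implementation): each point aggregates information only from its k nearest neighbours, exactly as a CNN kernel aggregates only over a local patch; a random linear map stands in for the learned MLP, and all names here are illustrative.

```python
import numpy as np

def edge_conv(points, features, k=3, rng=None):
    """Minimal EdgeConv-style update on a point cloud (illustrative sketch).

    For each point i, find its k nearest neighbours in coordinate space,
    build edge features [f_i, f_j - f_i], apply a shared linear map with
    ReLU (standing in for the learned MLP), and max-aggregate over the
    neighbours. The operation is local: only k neighbours contribute.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = features.shape
    # pairwise distances in coordinate space
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)             # exclude self from neighbours
    nbrs = np.argsort(dist, axis=1)[:, :k]     # (n, k) neighbour indices
    # edge features: concat(centre, neighbour - centre) -> (n, k, 2d)
    centre = np.repeat(features[:, None, :], k, axis=1)
    edge = np.concatenate([centre, features[nbrs] - centre], axis=-1)
    W = rng.standard_normal((2 * d, d))        # stand-in for the shared MLP
    out = np.maximum(edge @ W, 0.0)            # shared linear map + ReLU
    return out.max(axis=1)                     # max-aggregation over neighbours
```

Stacking several such layers (and, in a *dynamic* graph network, recomputing neighbours in the learned feature space after each layer) is what spreads information globally, which is the analogue of depth in a CNN.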
- "Normally, the network prediction is given by the class with the highest probability." -> I would never advocate to do this, since the best cut should depend on the relative abundance of signal and background which most certainly is not 50%-50% (=the prior used in your training).
- Fig. 3 right: can you please comment how this compares with the results shown in the community top tagging paper (1902.09914)?
- Why R = 0.8? That is not the radius used by ATLAS.
- p12: Please explain 3.84 (I am sure you have a good reason, but please don't make the reader guess!).
- "We can therefore safely neglect additional systematic uncertainties introduced by the dark shower tagger." -> the stat. uncertainty is at most 30% - it does not seem crazy to me that the systematic uncertainty would be comparable. Why is it then save to neglect?
- Last paragraph of the conclusions: unsupervised methods are not the only anomaly detection procedures that have been proposed - there are a variety of semi-supervised methods that may be (more) effective. See e.g. https://iml-wg.github.io/HEPML-LivingReview/.
- Delphes: what detector setup are you using?