Attention to the strengths of physical interactions: Transformer and graph-based event classification for particle physics experiments

Luc Builtjes, Sascha Caron, Polina Moskvitina, Clara Nellist, Roberto Ruiz de Austri, Rob Verheyen, Zhongyi Zhang

SciPost Phys. 19, 028 (2025) · published 28 July 2025

Abstract

A major task in particle physics is the measurement of rare signal processes. Even modest improvements in background rejection, at a fixed signal efficiency, can significantly enhance the measurement sensitivity. Building on prior research by others that incorporated physical symmetries into neural networks, this work extends those ideas to include additional physics-motivated features. Specifically, we introduce energy-dependent particle interaction strengths, derived from leading-order Standard Model (SM) predictions, into modern deep learning architectures, including transformers (Particle Transformer) and graph neural networks (ParticleNet). These interaction strengths, represented as the SM interaction matrix, are incorporated into the attention matrix (transformers) and the edges (graphs). Our results in event classification show that the integration of all physics-motivated features improves background rejection by $10\%-40\%$ over baseline models, with an additional gain of up to $9\%$ due to the SM interaction matrix. This study also provides one of the broadest comparisons of event classifiers to date, demonstrating how various architectures perform on this task. A simplified statistical analysis demonstrates that these enhanced architectures yield significant improvements in signal significance compared to a graph network baseline.
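The mechanism described above, adding pairwise physics-motivated quantities to the attention matrix, can be illustrated with a minimal sketch. The snippet below shows scaled dot-product attention with an additive pairwise bias `U` on the logits, the same injection point the Particle Transformer uses for pairwise interaction features; here `U` is a hypothetical stand-in for the SM interaction matrix, and all shapes and names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def biased_attention(Q, K, V, U):
    """Scaled dot-product attention with an additive pairwise bias.

    Q, K, V: (n, d) query/key/value matrices for n particles.
    U:       (n, n) pairwise bias, e.g. interaction strengths between
             particles i and j, added to the logits before the softmax.
    """
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d) + U  # bias shifts the attention pattern
    return softmax(logits, axis=-1) @ V

rng = np.random.default_rng(0)
n, d = 4, 8
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
# Hypothetical symmetric matrix standing in for SM interaction strengths
U = rng.normal(size=(n, n))
U = (U + U.T) / 2
out = biased_attention(Q, K, V, U)
print(out.shape)
```

In a graph network, the analogous step would attach the same pairwise quantity to each edge as a feature rather than adding it to attention logits.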
