https://doi.org/10.1140/epjc/s10052-025-14200-2
Regular Article - Theoretical Physics
Machine learning electroweakino production
1 Laboratoire de Physique Subatomique et de Cosmologie (LPSC), Université Grenoble-Alpes, 53 Avenue des Martyrs, 38026 Grenoble, France
2 Institute of Theoretical Physics, Faculty of Physics, University of Warsaw, Pasteura 5, 02-093 Warsaw, Poland
3 Theory Center, IPNS, KEK, 1-1 Oho, 305-0801 Tsukuba, Ibaraki, Japan
4 The Graduate University for Advanced Studies (Sokendai), 1-1 Oho, 305-0801 Tsukuba, Ibaraki, Japan
5 Kavli IPMU (WPI), University of Tokyo, 5-1-5 Kashiwanoha, 277-8583 Kashiwa, Chiba, Japan
Received: 7 February 2025 / Accepted: 21 April 2025 / Published online: 2 June 2025
The system of light electroweakinos and heavy squarks gives rise to one of the most challenging signatures to detect at the LHC. It consists of missing transverse energy recoiling against a few hadronic jets originating either from QCD radiation or from squark decays. The analysis generally suffers from the large irreducible Z(→νν̄)+jets background. In this study, we explore machine learning (ML) methods for efficient signal/background discrimination. Our best attempt uses both reconstructed objects (jets, missing transverse energy, etc.) and low-level (particle-flow) objects. We find that the discrimination performance improves as the transverse-momentum threshold for soft particles is lowered from 10 to 1 GeV, at the expense of a larger systematic uncertainty. In many cases, the ML method provides a factor-two enhancement in signal significance over a simple kinematical selection. The sensitivity in the squark-electroweakino mass plane is derived with this method, assuming the Run-3 and HL-LHC luminosities. Moreover, we investigate the relations between input features and the network's classification performance to reveal the physical information used in the background/signal discrimination process.
© The Author(s) 2025
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Funded by SCOAP3.