https://doi.org/10.1140/epjc/s10052-025-13785-y
Regular Article - Experimental Physics
Tagging more quark jet flavours at FCC-ee at 91 GeV with a transformer-based neural network
1 Inter-university Institute for High Energies, Vrije Universiteit Brussel, 1050, Brussels, Belgium
2 Deutsches Elektronen-Synchrotron DESY, Notkestr. 85, 22607, Hamburg, Germany
3 Universität Zürich, Winterthurerstr. 190, 8057, Zurich, Switzerland
4 Universität Hamburg, Luruper Chaussee 149, 22761, Hamburg, Germany
Received: 15 July 2024 / Accepted: 2 January 2025 / Published online: 10 February 2025
Jet flavour tagging is crucial in experimental high-energy physics. A tagging algorithm, DeepJetTransformer, is presented, which exploits a transformer-based neural network that is substantially faster to train than state-of-the-art graph neural networks. The DeepJetTransformer algorithm uses information from particle-flow-style objects and secondary vertex reconstruction for b- and c-jet identification, supplemented by additional information that is not always included in tagging algorithms at the LHC, such as reconstructed $K^0_S$ and $\Lambda^0$ candidates and $K^\pm$ discrimination. The model is trained as a multiclassifier to identify all quark flavours separately and performs excellently in identifying b- and c-jets. An s-tagging efficiency of 40% can be achieved with a 10% ud-jet background efficiency. The performance improvement achieved by including $K^0_S$ and $\Lambda^0$ reconstruction and $K^\pm$ discrimination is presented. The algorithm is applied to exclusive $Z \to q\bar{q}$ samples to examine the physics potential and is shown to isolate $Z \to s\bar{s}$ events. Assuming all non-$q\bar{q}$ backgrounds can be efficiently rejected, a $5\sigma$ discovery significance for $Z \to s\bar{s}$ can be achieved with an $e^+e^-$ collision dataset at $\sqrt{s} = 91.2$ GeV corresponding to less than a second of the FCC-ee run plan at the Z boson resonance.
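The quoted discovery significance can be illustrated with the standard Asimov counting-experiment approximation, $Z = \sqrt{2\,[(s+b)\ln(1+s/b) - s]}$. This is a sketch only: the paper's exact statistical treatment and event yields are not reproduced here, and the counts in the usage example are purely hypothetical.

```python
import math


def discovery_significance(n_signal: float, n_background: float) -> float:
    """Asimov discovery significance for a counting experiment:
    Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s)).
    Reduces to the familiar s / sqrt(b) when s << b."""
    s, b = n_signal, n_background
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))


# Hypothetical yields: tagged Z -> ss-bar candidates over a mistagged
# ud-jet background, after a flavour-tagging selection.
z = discovery_significance(n_signal=250.0, n_background=2000.0)
print(f"Z = {z:.2f} sigma")
```

Because the significance scales roughly with the square root of the integrated luminosity, even a very short run at the Z pole accumulates enough hadronic events for a $5\sigma$ observation once the tagger suppresses the light-jet background sufficiently.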
© The Author(s) 2025
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Funded by SCOAP3.