Anders Kvellestad and fellow researchers in the GAMBIT Collaboration have recently been granted 80 million CPU hours on LUMI through the EuroHPC JU Extreme Scale Access.
This allocation will enable the researchers to carry out extensive simulations and analyses to map out particle physics theories.
Since its initial release in 2017, GAMBIT has become a leading tool for large-scale statistical studies of the parameter spaces of Beyond-the-Standard-Model physics theories.
This field of study explores new theories about the physics of subatomic particles. Professor Are Raklev is heading the computational research being conducted on the Sigma2 infrastructure.
The latest results from GAMBIT
GAMBIT, which stands for the Global and Modular Beyond-the-Standard-Model Inference Tool, is a project that explores theories and models going beyond the Standard Model of particle physics. The research group has recently published two important studies, made possible through their PRACE Project Access grant from 2021.
"Thermal WIMPs and the scale of new physics: global fits of Dirac dark matter effective field theories" is a comprehensive statistical mapping of the current status of WIMP theories for dark matter. WIMP stands for Weakly Interacting Massive Particle, and WIMPs have traditionally been the most popular class of dark matter candidates among particle physicists. The study concludes that if the WIMP has a mass below 100 GeV (roughly 100 times the proton mass), it probably cannot account for all the dark matter in the universe. However, many variants of WIMP models that can explain dark matter still remain viable.
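For scale, the "roughly 100 times the proton mass" comparison can be checked with a quick calculation (a sketch; the proton mass value used here is the standard PDG figure, not taken from the article):

```python
# Compare the 100 GeV WIMP mass threshold to the proton rest mass.
# Proton mass from the PDG, rounded; used here only for illustration.
PROTON_MASS_GEV = 0.938  # proton rest mass in GeV/c^2

wimp_threshold_gev = 100.0
ratio = wimp_threshold_gev / PROTON_MASS_GEV
print(f"100 GeV is about {ratio:.0f} proton masses")  # about 107
```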
"Collider constraints on electroweakinos in the presence of a light gravitino" is a study that focuses on the experiments at the Large Hadron Collider (LHC) at CERN.
The goal of the study was to map out what the current set of experimental results from the LHC can tell us about the existence of some hypothetical particles called neutralinos, charginos, and gravitinos. A class of particle physics theories known as supersymmetric theories predicts the existence of these new particles, and they may possibly appear in proton-proton collisions at the LHC.
— If these new particles exist, we might be able to detect them through faint statistical patterns in the enormous datasets from the LHC. So what we did in our study was to conduct vast amounts of particle collision simulations to precisely map out what kind of statistical patterns such particles may have left in a wide range of different datasets from the ATLAS and CMS experiments at the LHC, says Anders Kvellestad about this study.
They examined over 300,000 different versions (different parameter choices) of this supersymmetric theory, and for each version they simulated 16 million proton-proton collisions. This enormous total number of simulations is why they applied for PRACE resources to carry out the study.
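The scale of the computation follows from simple arithmetic, using the figures quoted above:

```python
# Total simulated proton-proton collisions in the study,
# based on the numbers given in the article.
parameter_points = 300_000          # versions of the supersymmetric theory
collisions_per_point = 16_000_000   # simulated pp collisions per version

total_collisions = parameter_points * collisions_per_point
print(f"{total_collisions:.1e} simulated collisions in total")  # 4.8e+12
```

That is on the order of trillions of simulated collisions, well beyond what a single workstation or small cluster could handle.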
The study concludes that the current set of LHC results excludes many variants of supersymmetric theories, but particles that could be discovered in future searches may still exist.
LUMI Supercomputer powers up for next-level particle physics exploration
With the new allocation on LUMI-C, the research group plans to conduct several comprehensive mapping studies. These studies will be more detailed both in terms of the simulations and the mapping of the theories.
The new project, "Mapping the microcosm: discoverable collider physics beyond the Standard Model", will utilise LUMI's computing power to investigate a range of promising extensions of the Standard Model of particle physics. The results will be used to define new search and measurement programmes at the LHC experiments to maximise the potential of LHC Run 3, and to guide the development of a physics programme for future colliders.
— The allocation of CPU hours on LUMI will give us the opportunity to conduct more detailed and thorough studies of various particle physics theories. This will help us extract as much new knowledge as possible from the huge datasets generated by the LHC experiments, and to plan future experiments as effectively as possible, says Kvellestad.