Use case class 4 -- User Story

Use case class 4: Spanning almost the whole of PUNCH with global analyses

Present Andrea and Reinhard:

Andrea and Reinhard are interested in analysing a revolutionary model of New Physics. They hope that their model explains the relic abundance of dark matter in the universe and present anomalies in particle physics data (e.g. the muon anomalous magnetic moment or anomalous rates of rare decays of B mesons), and that it is not yet excluded by experiments, because it only slightly modifies the precision predictions of the SM in the electroweak and Higgs boson sectors, and because the new particles it predicts could so far have escaped discovery at the LHC. To analyse their model, Andrea and Reinhard thus have to use measurements ranging from astrophysics through low-energy precision physics and hadron physics to high-energy particle physics precision measurements and searches. They have to draw on both legacy experiments like LEP and running experiments like the LHC.

Present Andrea and Reinhard must piece together information of very different types and levels of complexity, and face a huge challenge both in accessing the underlying data and in achieving a consistent statistical interpretation of all the legacy data. They have to do a lot of unnecessary coding, and a lot of approximation and guesswork.

Future Andrea and Reinhard:

Future Andrea and Reinhard review the literature and, through the PUNCH portal, select the papers on which they want to base their statistical analysis. They define the Lagrangian of their model and the parameters they wish to scan, and prepare them in a phenomenological code that is either already available on the PUNCH-SDP or uploaded by themselves. From the list of available data they select precision results and Higgs boson measurements and searches from LEP, results from the muon g-2 experiment and from B-meson physics, and cosmological measurements, all stored as DRPs on the PUNCH-SDP. They combine these and their theoretical model into a new DRP on the platform. Andrea and Reinhard use all analyses that are directly applicable to their model, but for ATLAS and CMS they observe that one specific search would be more sensitive to their model if a b-jet cut were dropped. They modify the selection provided by ATLAS and CMS on the platform and re-evaluate the results. On the PUNCH-SDP, they then apply a consistent statistical model across all analyses to derive likelihoods for all selected observables and combine them.
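The combination step above can be sketched in a few lines. This is a minimal illustration, not the platform's actual API: the per-analysis likelihood functions, the Gaussian shapes, and the parameter values are all invented placeholders standing in for the likelihoods the platform would derive from the selected DRPs. For independent analyses, the combined log-likelihood of a model point is simply the sum of the individual log-likelihoods, which can then be maximized over the parameter scan.

```python
# Hypothetical per-analysis log-likelihoods for a one-parameter model scan
# (e.g. the mass of a predicted new particle in GeV). In a real analysis
# these would be derived from the DRPs selected on the PUNCH-SDP; here
# they are simple Gaussian stand-ins with made-up central values.
def loglike_lep(m):
    return -0.5 * ((m - 500.0) / 80.0) ** 2

def loglike_muon_g2(m):
    return -0.5 * ((m - 450.0) / 120.0) ** 2

def loglike_b_physics(m):
    return -0.5 * ((m - 520.0) / 100.0) ** 2

analyses = [loglike_lep, loglike_muon_g2, loglike_b_physics]

def combined_loglike(m):
    # Independent analyses: log-likelihoods add.
    return sum(ll(m) for ll in analyses)

# Scan the parameter on a grid from 300 to 800 GeV and keep the best point.
scan = [300.0 + 5.0 * i for i in range(101)]
best = max(scan, key=combined_loglike)
```

In practice each analysis would contribute a full statistical model rather than a one-dimensional Gaussian, but the structure of the combination, a sum of log-likelihoods evaluated per scan point, stays the same.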

Andrea can use the code straight away, as it is already optimized for use at the HPC centers. Moreover, even if an HPC center offers its services on a new machine, the code will be adapted to it. She can now obtain statistically reliable results. She also does not have to worry about where to store the data she wants to publish: this is done automatically for her. She only has to choose which of her results to use for her publication; the data will then be stored, easily identifiable by a DOI. In addition, the input setup will be stored and made publicly available, as the initial conditions and simulation parameters are published together with the data. She can choose whether to interact directly with the different HPC centers or, for convenience, to do so through the PUNCH-SDP.
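The publication step could bundle the results with the information needed to reproduce them. The sketch below shows one possible shape of such a record; every field name, the placeholder DOI, and the file names are illustrative assumptions, not a schema defined by PUNCH.

```python
import json

# Hypothetical metadata record accompanying a published result: the DOI
# (assigned by the platform; "10.xxxx/..." is a placeholder), the result
# files Andrea selected, and the input setup and simulation parameters
# that make the published data reproducible.
record = {
    "doi": "10.xxxx/punch.example",
    "results": ["scan_bestfit.csv", "combined_likelihood.csv"],
    "input_setup": {
        "model": "example-new-physics-lagrangian",
        "scanned_parameters": {"m_new_GeV": [300.0, 800.0]},
    },
    "simulation_parameters": {
        "grid_points": 101,
        "statistics": "combined profile likelihood",
    },
}

# Serialize for storage and check that the record round-trips intact.
serialized = json.dumps(record, indent=2)
restored = json.loads(serialized)
```

Keeping the input setup in the same record as the results is what lets a later reader rerun the scan under identical conditions, which is the point of publishing initial conditions and simulation parameters alongside the data.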