Use case class 3: execution and analysis of numerical simulations
Numerical simulations play a fundamental role in the interpretation of experimental data.
Producing these data is computationally expensive and often yields large data volumes.
Additional structures and tools are therefore needed to support the optimised use of
simulation data and to make these simulations accessible and analysable through a common
set of tools.
PUNCH will provide several actions that will optimise the performance of the most commonly
used codes and will moreover provide easy-to-use tools for publishing the data, which also
requires the development of corresponding metadata standards.
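As an illustration of what such a metadata standard might cover, the following is a minimal sketch of a machine-readable record accompanying a published simulation dataset. All field names and values here are hypothetical, chosen for illustration only, and do not represent an agreed PUNCH4NFDI schema.

```python
import json

# Illustrative metadata record for a published simulation dataset.
# Field names and values are hypothetical, not an agreed PUNCH standard.
record = {
    "title": "Example simulation ensemble",
    "code": {"name": "example-code", "version": "1.2.0"},
    "parameters": {"beta": 6.0, "lattice_size": [32, 32, 32, 64]},
    "identifier": {"type": "DOI", "value": "10.0000/example"},
    "license": "CC-BY-4.0",
}

# Serialise deterministically so records can be diffed and validated.
serialized = json.dumps(record, indent=2, sort_keys=True)
print(serialized)
```

A record like this, registered together with a DOI, is what would make the data findable and the producing code version traceable.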
Note: The links at the individual use cases lead to internal PUNCH4NFDI pages.
Simulations of lattice QCD are among the main users of the fastest supercomputers worldwide. The products of these simulations are the so-called gauge configurations, which need to be analysed in order to compute predictions for experiments. The software required for these simulations is complex and has to be carefully optimised for the latest HPC architectures used in the fastest supercomputers of the day. The availability of such software is a critical competitive advantage for the German LQCD community.
Dynamical simulations of heavy-ion reactions by means of transport and/or hydrodynamics approaches are crucial for learning about the properties of strongly-interacting matter. Theorist A wants to run a simulation to verify whether the code matches experimental data on pion production at a certain beam energy. First, the centrality determination has to be emulated and matched to the specific experiment; in particular, if several experiments cover the same energy range, each has its own specifics in terms of kinematic cuts and event selection criteria. The code itself has to carry a version number and should be available open source with a certain level of quality control (analysis suite). This can be achieved with HepMC plus Rivet plus proper code maintenance. The ''data'' (at least those needed to reproduce plots; otherwise the volumes are too large) also need to be available. All of this (data and code) should be findable via DOIs and other schemes.
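The emulation of experiment-specific selections described above can be sketched as follows. This is a toy pure-Python example, not a real transport code and not the actual HepMC/Rivet workflow; the cut values, multiplicity thresholds, and event structure are all illustrative assumptions.

```python
import random

# Toy event: a list of "pions", each with transverse momentum pT and rapidity y.
def make_toy_event(rng, max_mult=40):
    n = rng.randint(5, max_mult)
    return [{"pT": rng.uniform(0.0, 2.0), "y": rng.uniform(-2.0, 2.0)}
            for _ in range(n)]

# Experiment-specific kinematic cuts (values are illustrative, not from any
# real analysis): accept pions above a pT threshold inside a rapidity window.
def select_pions(event, pt_min=0.2, y_max=0.5):
    return [p for p in event if p["pT"] > pt_min and abs(p["y"]) < y_max]

# Emulated centrality determination from charged-particle multiplicity.
def centrality_class(event, thresholds=(30, 15)):
    n = len(event)
    if n >= thresholds[0]:
        return "central"
    if n >= thresholds[1]:
        return "mid"
    return "peripheral"

rng = random.Random(1)
events = [make_toy_event(rng) for _ in range(100)]
accepted = [select_pions(ev) for ev in events]
mean_accepted = sum(len(a) for a in accepted) / len(accepted)
print(f"mean accepted pions per event: {mean_accepted:.2f}")
```

In a real analysis the cut definitions and the centrality calibration would come from the experiment (e.g. encoded in a Rivet analysis), so that the simulation output is filtered exactly as the measured data were.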
Production of simulated data in a similar way, often specific to particular types of analyses, usually managed centrally by the experiments
Processing of particle-in-cell simulation data
Data are nearly useless without information about how they were obtained. Therefore, instrument response functions (IRFs) need to be stored and curated alongside the data. Since IRFs can come in various formats, generic representations and/or interfaces to such functions need to be developed. For imaging, signal reconstruction, and the application of machine-learning techniques to data obtained through IRFs, not only must the IRFs be available as fast forward codes, mapping potential sky brightness distributions into data space, but the derivative and the adjoint of the IRFs are also required, in order to backpropagate the mismatch of a model in data space into corrections of the latent variables of the sky model.
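For a linear IRF the forward map, its derivative, and its adjoint can be made concrete with a small numpy sketch. Here the IRF is a toy random matrix R (a stand-in for a real response); for a linear map the derivative is R itself and the adjoint is simply the transpose, which backpropagates a data-space residual into sky space.

```python
import numpy as np

# Toy linear instrument response R: maps a flattened sky brightness vector s
# (5 pixels) to a data vector d = R @ s (8 channels).  R is random here,
# standing in for a real IRF.
rng = np.random.default_rng(0)
R = rng.random((8, 5))

def forward(sky):
    return R @ sky

def adjoint(data_residual):
    # Backpropagates a data-space mismatch into the sky (latent) space.
    return R.T @ data_residual

sky = rng.random(5)
grad = adjoint(forward(sky) - forward(sky))  # zero residual -> zero gradient

# Adjoint consistency check: <R s, d> == <s, R^T d> for any s, d.
d = rng.random(8)
lhs = forward(sky) @ d
rhs = sky @ adjoint(d)
print("adjoint check:", np.isclose(lhs, rhs))
```

For nonlinear IRFs the same pattern holds locally: one needs the Jacobian-vector product (forward derivative) and the vector-Jacobian product (adjoint), which is exactly what automatic differentiation frameworks provide.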
Run and analyse simulations with radiative transfer of ionizing photons on multiple scales, with a variety of resolutions and source properties, to be employed in studies of IGM reionization, the Lyman-alpha forest, and GRBs, among others.
N-body simulations find application in many fields of astrophysics, spanning many orders of magnitude: examples range from the dynamics of galaxies down to dust growth during planet formation in protoplanetary discs. As diverse as these applications are, it is often the case that the same basic code is used to simulate them. However, the simulations are usually extremely time consuming, so they are mostly performed on HPC facilities, which requires the codes to be optimised for the specific facility to be used.
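The common core of such codes is a gravitational force evaluation combined with a time integrator. The following is a minimal pure-Python sketch of a direct-summation N-body step with a leapfrog (kick-drift-kick) integrator, in units with G = 1; real production codes add force softening, tree or mesh force evaluation, and architecture-specific optimisation.

```python
import math

# Direct O(N^2) gravitational accelerations in 2D, units with G = 1.
def accelerations(pos, mass):
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            acc[i][0] += mass[j] * dx / r3
            acc[i][1] += mass[j] * dy / r3
    return acc

# Leapfrog (kick-drift-kick): symplectic, so energy errors stay bounded.
def leapfrog(pos, vel, mass, dt, steps):
    acc = accelerations(pos, mass)
    for _ in range(steps):
        for i in range(len(pos)):
            vel[i][0] += 0.5 * dt * acc[i][0]   # half kick
            vel[i][1] += 0.5 * dt * acc[i][1]
            pos[i][0] += dt * vel[i][0]         # drift
            pos[i][1] += dt * vel[i][1]
        acc = accelerations(pos, mass)
        for i in range(len(pos)):
            vel[i][0] += 0.5 * dt * acc[i][0]   # half kick
            vel[i][1] += 0.5 * dt * acc[i][1]
    return pos, vel

# Two equal masses on a circular orbit of radius 1 about the barycentre:
# force = 1/4, so the circular speed satisfies v^2 = 1/4.
mass = [1.0, 1.0]
pos = [[1.0, 0.0], [-1.0, 0.0]]
vel = [[0.0, 0.5], [0.0, -0.5]]
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=100)
sep = math.hypot(pos[0][0] - pos[1][0], pos[0][1] - pos[1][1])
print(f"separation after integration: {sep:.3f}")
```

On the circular-orbit test the particle separation should remain close to 2, reflecting the bounded energy error of the symplectic integrator.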
Provide all simulated data (from various HPC facilities) of cosmology simulations, together with sufficient tools and machinery for analysis, to a broader international collaboration.
The simulation of high-energy showers in any detector is very time consuming; new methods need to be developed.