Tackling large data sets and many-parameter problems in particle physics
A new paper published in EPJ Plus, authored by Ursula Laa from the Institute of Statistics at BOKU University, Vienna, and German Valencia from the School of Physics and Astronomy, Monash University, Clayton, Australia, looks at simplifying large-data-set, many-parameter problems using tools that split large parameter spaces into a small number of regions.
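The paper develops this idea in detail; the short sketch below is only a rough, hypothetical illustration of how a parameter space might be split into a few regions, here by clustering the points of a toy two-parameter model on the observables they predict. The model, the observables and the clustering choices are assumptions made for this example, not the authors' actual procedure.

```python
# Illustrative sketch only: split a toy 2-parameter space into a few regions
# by clustering its points on the observables they predict.  The toy model,
# observables and number of regions are invented for this example.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# A grid over a hypothetical 2-dimensional parameter space (c1, c2).
c1, c2 = np.meshgrid(np.linspace(-1, 1, 40), np.linspace(-1, 1, 40))
params = np.column_stack([c1.ravel(), c2.ravel()])

# Toy "theory predictions": each parameter point predicts three observables.
def predict(p):
    x, y = p
    return np.array([x + 0.5 * y, x * y, np.sin(3.0 * x) - y**2])

predictions = np.array([predict(p) for p in params])

# Hierarchical clustering of the predictions groups parameter points that
# predict similar observables, splitting the space into a few regions.
Z = linkage(predictions, method="ward")
regions = fcluster(Z, t=4, criterion="maxclust")

for r in np.unique(regions):
    centre = params[regions == r].mean(axis=0)
    print(f"region {r}: {np.sum(regions == r)} points, parameter centre {centre.round(2)}")
```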
“We applied our tools to the so-called B-anomaly problem. In this problem there is a large number of experimental results and a theory that predicts them in terms of several parameters,” Laa says. “The problem has received much attention because the preferred parameters to explain the observations do not correspond to those predicted by the standard model of particle physics, and as such the results would imply new physics.”
Valencia continues by explaining that the paper shows how the Pandemonium tool provides an interactive, graphical way to study the connections between features of the observations and regions of parameter space.
“In the B-anomaly problem, for example, we can clearly visualise the tension between two important observables that have been singled out in the past,” Valencia says. “We can also see which improved measurements would be best to address that tension.
“This can be most helpful in prioritising future experiments to address unresolved questions.”
Laa elaborates by explaining that the methods the duo developed and used are applicable to many other problems, in particular to models and observables, such as multi-Higgs models, that are less well understood than the applications discussed in the paper.
“A challenge is the visualisation of multidimensional parameter spaces; the current interface only allows the user to visualise high-dimensional data spaces interactively,” Laa concludes. “The challenge is to automate this, which will be addressed in future work using techniques from dimension reduction.”
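As a rough, hypothetical illustration of what such a step might look like, the sketch below uses principal component analysis, one standard dimension-reduction technique, to project a toy multidimensional parameter sample down to two dimensions, the kind of low-dimensional view that could then be explored interactively. The sample and the choice of technique are assumptions made for this example, not the method the authors plan to use.

```python
# Illustrative sketch only: reduce a toy 6-dimensional parameter sample to two
# dimensions with principal component analysis.  The sample is invented: points
# lie near a 2D plane inside the 6D space, mimicking a fit that strongly
# constrains most parameter combinations.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Two latent directions plus small scatter in the remaining directions.
latent = rng.normal(size=(2000, 2))
mixing = rng.normal(size=(2, 6))
params = latent @ mixing + 0.05 * rng.normal(size=(2000, 6))

# Project onto the two directions that capture most of the variation, giving a
# 2D view that could be displayed and explored interactively.
pca = PCA(n_components=2)
embedding = pca.fit_transform(params)

print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
print("first five projected points:\n", embedding[:5].round(3))
```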