~ Greg Moore
Scientists believe that in the first microseconds after the Big Bang, the universe was in a hot, dense state called the quark–gluon plasma (QGP). As the universe expanded, this fluid-like plasma condensed into the protons and neutrons that make up the matter we know today. Jonah Bernhard is a third-year Ph.D. candidate in the Duke University physics department, where he works with Professor Steffen A. Bass and his group studying the formation and properties of the QGP produced in collisions of heavy nuclei. Bernhard leads a project — best described as “on the border of nuclear physics and computer science” — that uses the Open Science Grid (OSG).
“In a particle accelerator,” says Bernhard, “we collide heavy nuclei in order to create droplets of fluid similar to the state of the early universe. This primordial fluid expands rapidly and cools before we observe it in our particle detectors.” These are the most extreme conditions ever produced on Earth in terms of heat and density, according to Bernhard; the droplets are about 100,000 times hotter than the core of the sun. The problem for Bernhard and his colleagues is that the collisions are extremely brief and the QGP volume vanishingly small, so they can observe only the “ashes” of its decay: particles called hadrons, which large detectors record and store as data on computers.
In the 1980s and 1990s, scientists first tried to create the QGP at CERN’s Super Proton Synchrotron. Current experiments are conducted at Brookhaven’s Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider at CERN; the first solid evidence for the QGP was observed at RHIC. “We can’t observe the QGP directly,” said Bernhard. “If we want to draw conclusions, we must use computational models. We match our computer models to observations of the experiments.” If the model and the observations match, the researchers have a good idea of what is actually taking place. However, “Computer models are very complicated and computationally expensive. They can take a few hours to simulate one observation — one collision.”
Bernhard refers to this as a model-to-data comparison. “We have a complex computer model which simulates a physical process and must be calibrated to experimental results. By calibration, I mean tuning a set of model parameters so that the model optimally matches data,” he explains. “Presumably, some of the model parameters are real physical parameters to be measured. In my case, I am tuning a model of relativistic nuclear collisions to match data from actual collisions conducted in particle accelerators.”
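The calibration idea can be sketched in a few lines. This is a toy illustration only — a one-parameter model fit by a chi-square scan — not Bernhard’s actual heavy-ion model or code; the model function, parameter name, and data are all invented for the example:

```python
import numpy as np

def model(x, eta):
    """Hypothetical one-parameter model (stand-in for the real simulation)."""
    return np.exp(-eta * x)

# Pretend these are measured observables with uncertainties;
# the "true" parameter used to generate them is 0.5.
x_data = np.linspace(0.0, 2.0, 10)
y_data = np.exp(-0.5 * x_data) + np.random.default_rng(1).normal(0.0, 0.01, 10)
y_err = np.full_like(y_data, 0.01)

# Calibration: scan a grid of parameter values and keep the one
# whose model output best matches the data (minimum chi-square).
eta_grid = np.linspace(0.1, 1.0, 901)
chi2 = [np.sum(((model(x_data, eta) - y_data) / y_err) ** 2)
        for eta in eta_grid]
best_eta = eta_grid[int(np.argmin(chi2))]
print(f"calibrated parameter: {best_eta:.3f}")  # should land near the true 0.5
```

The real problem differs in scale, not in kind: the “model” is an hours-long simulation, there are several parameters instead of one, and each parameter-space point requires thousands of simulated events.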
Figure 1: Time evolution of a heavy-ion collision. The nuclei approach each other at nearly the speed of light and collide, creating new matter in the process. The new matter expands and cools as a fluid, eventually freezing into particles, which continue to expand as a gas. The time scale is 10–20 fm/c (1 fm/c is the time it takes light to travel 1 fm; since 1 fm = 10^-15 meters, 1 fm/c ≈ 3.3×10^-24 seconds). The entire collision lasts perhaps 10^-23 seconds, orders of magnitude shorter than anything that can be measured directly. Picture credit: Jonah Bernhard.
Bernhard and his colleagues are interested in the properties of the QGP created in the collisions. To measure these properties, they are performing model-to-data comparison on a much larger scale, and more systematically and rigorously, than ever before. Each event takes roughly one hour to run. Bernhard is tuning five parameters, and he needs about 10,000 events for each point in parameter space. “Now suppose I want to calculate 200 points in parameter space,” notes Bernhard. “The CPU-time requirements quickly become very large.” Under optimal conditions, he routinely runs an unprecedented 500,000 CPU hours a week (equivalent to about 57 years of serial computing) with a success rate of around 99.9%.
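The back-of-envelope arithmetic behind those figures, using only the numbers quoted in the text:

```python
# CPU budget implied by the numbers above.
hours_per_event = 1        # roughly one CPU hour per simulated event
events_per_point = 10_000  # events needed at each parameter-space point
points = 200               # parameter-space points in the scan

total_cpu_hours = hours_per_event * events_per_point * points
print(total_cpu_hours)     # 2,000,000 CPU hours for the full scan

# And the weekly throughput: 500,000 CPU hours per week corresponds
# to roughly 57 years of a single CPU running nonstop.
weekly_hours = 500_000
print(round(weekly_hours / (24 * 365), 1))
```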
That’s where the OSG comes into play. Professor Bass had a general idea for the project, and when Bernhard joined the group, his computing experience made him the clear person to pursue it. To interface with the OSG, Bernhard wrote his own scripts, which select a collision model and run it at massive scale. No standard tools existed to automate this step, so he wrote and optimized the utilities that automatically run the models on the grid. He then uses standard Condor workload-management software to manage the jobs on the grid. “I had previous experience, but I’m not the first person to do this,” clarified Bernhard. “I used some existing code and physics models written by others and didn’t have to start from scratch.”
Everything runs via one command that generates and submits the jobs. The OSG runs the models and then copies the output back to storage space at Duke a few hours later. “This project is perfectly suited for OSG,” says Bernhard, “because we can run many jobs independently of each other. When the simulations of millions of independent events don’t depend on each other, it’s easy to scale things up to the grid. And its sheer availability is invaluable.”
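Because each event is independent, this workload maps naturally onto Condor’s batch model: one submit description can queue thousands of identical jobs that differ only by an index. A submit file along these lines conveys the idea — the file names and paths here are hypothetical, not taken from Bernhard’s scripts:

```
# Hypothetical HTCondor submit description for one batch of events.
universe   = vanilla
executable = run_event.sh
arguments  = $(Process)
output     = logs/event_$(Process).out
error      = logs/event_$(Process).err
log        = events.log
transfer_input_files = model.tar.gz
queue 10000
```

Submitting this file once queues 10,000 jobs; Condor matches each to an available grid slot and transfers the outputs back when the job completes.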
“With other systems, you might have to write a grant application to get CPU time, and who knows how long it’s going to take,” he added. “In the past, scientists were limited by CPU time and couldn’t be as systematic. They tuned things by hand. Now with the OSG, we can extract rigorous constraints on the quantities we are looking for.”
Using statistical analysis, Bernhard is able to extract rigorous estimates of each parameter, including errors. (He also notes that these ideas are easily transferable to other disciplines that require model-to-data comparison.) “We are studying conditions of the first microseconds of the universe and recreating the conditions right after the Big Bang in the lab, using computer models to characterize it,” said Bernhard. “We need the experiments, but computer models are essential to extract meaning from the measurements.”
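The final step — turning many simulated events into a parameter estimate with an error bar — can be illustrated with a deliberately simple stand-in (this is not the group’s actual statistical machinery, just the basic idea of an estimate plus uncertainty):

```python
import numpy as np

# Toy sketch: combine repeated noisy "measurements" of one quantity
# into a best estimate and a standard error on that estimate.
rng = np.random.default_rng(42)
true_value = 2.5                      # invented for the example
measurements = true_value + rng.normal(0.0, 0.2, size=400)

estimate = measurements.mean()
std_error = measurements.std(ddof=1) / np.sqrt(len(measurements))
print(f"estimate: {estimate:.3f} +/- {std_error:.3f}")
```

More data shrinks the error bar as 1/sqrt(N), which is why running hundreds of thousands of CPU hours of events translates directly into tighter constraints on the physical parameters.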
Find Jonah Bernhard’s utilities and models for running on the OSG in his GitHub repository.