IceCube is providing a unique view into the universe

IceCube is a neutrino detector at the South Pole that uses 5,160 light sensors to detect the tiny flashes of light produced when neutrinos interact with the ice. Neutrinos are hard-to-detect subatomic particles with nearly no mass that are created by the sun, radioactive decay, cosmic rays, and violent events such as exploding stars. The detector—sometimes referred to as a telescope because it can tell where the neutrinos are coming from—takes precise measurements of their energy and direction.

A large mass of water or ice is ideal for detecting neutrinos, so in 1999 the originators of IceCube submitted their proposal to build at the South Pole. Taking advantage of the large volume of ice there, IceCube is the world’s largest neutrino detector, encompassing a cubic kilometer of ice. That ice is also exceptionally clear, which makes the site even better suited to the task. Construction started in 2004 and finished in the 2010–11 austral summer season. The collaboration now includes 300 people from 44 institutions in 12 countries.

In some ways, the computing challenges of IceCube are unique. The sensors start at a depth of 1.5 kilometers and go to a depth of 2.5 kilometers in the ice. They generate about one terabyte of data every day, 365 days a year. That amount is too much for the project’s satellite hookup, which allows about 100 gigabytes per day—only about one-tenth of the daily data. The project has 50 servers on-site in a small data center where the data is filtered down to the most interesting 100 gigabytes, which is then sent to the University of Wisconsin–Madison (UW–Madison) via this satellite connection.
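
As a rough illustration of that bottleneck, the arithmetic below is a back-of-envelope sketch in Python built only from the figures quoted above (about one terabyte of raw data per day against roughly 100 gigabytes per day of satellite capacity); it is not the project’s pipeline code.

    # Back-of-envelope arithmetic using the figures quoted in the article;
    # this is not the project's actual pipeline code.
    RAW_TB_PER_DAY = 1.0           # raw sensor data produced each day
    SATELLITE_GB_PER_DAY = 100.0   # approximate daily satellite allowance

    raw_gb_per_day = RAW_TB_PER_DAY * 1000            # decimal units for simplicity
    keep_fraction = SATELLITE_GB_PER_DAY / raw_gb_per_day

    print(f"Raw data per day:      {raw_gb_per_day:.0f} GB")
    print(f"Satellite budget:      {SATELLITE_GB_PER_DAY:.0f} GB/day")
    print(f"Fraction that can fly: {keep_fraction:.0%}")   # roughly one-tenth of the volume
    print(f"Raw data per year:     {raw_gb_per_day * 365 / 1000:.0f} TB")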


Gonzalo Merino
Image credit: Silvia Bravo
Gonzalo Merino is the IceCube computing facilities manager at UW–Madison. He has a staff of around 10 people who can start post-processing the data and begin analysis without much lag. All the raw data are archived at the pole and physically retrieved once a year for long-term archiving, when team members rotate in and out during the austral summer season.

Merino says that when they start the processing pipeline at UW–Madison and produce data products, the data can expand to three to four times its original, filtered size. Another big part of the computing challenge is all the simulation that is needed. “Many of our analyses rely on simulations, which we then compare to the actual data,” says Merino. “The Monte Carlo simulation data piles up to more than half a petabyte every year.” [Editor’s note: “Monte Carlo” methods estimate results by running many randomized trials and recording the outcomes; the name comes from the famous casino.]

“IceCube, in terms of core computing resources, includes the main data center, the Tier-0, here at UW–Madison, where the data are received from the South Pole. We currently warehouse about 3.5 petabytes at the Tier-0—all the accumulated data since the beginning of the detector. There is also a Tier-1 center in Germany, DESY-Zeuthen, where the processed data products are replicated.” UW–Madison also provides data access and processing services for collaborators all over the world. Most are in the U.S. and Europe, and some are in Asia and Australia.
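
Using only the numbers quoted in this article, a similar back-of-envelope sketch gives a sense of how the archive grows; the expansion factor is an assumed midpoint, and simply adding the categories together is a simplification, not IceCube’s real bookkeeping.

    # Rough yearly storage growth, using only figures quoted in the article.
    # The expansion factor and the simple sum of categories are assumptions.
    FILTERED_GB_PER_DAY = 100        # filtered data arriving over the satellite link
    EXPANSION_FACTOR = 3.5           # processed products grow to 3-4x the filtered size
    MONTE_CARLO_PB_PER_YEAR = 0.5    # "more than half a petabyte every year"

    filtered_pb_per_year = FILTERED_GB_PER_DAY * 365 / 1e6   # GB -> PB, decimal units
    processed_pb_per_year = filtered_pb_per_year * EXPANSION_FACTOR
    total_pb_per_year = (filtered_pb_per_year + processed_pb_per_year
                         + MONTE_CARLO_PB_PER_YEAR)

    print(f"Filtered data:  {filtered_pb_per_year:.2f} PB/year")
    print(f"Processed data: {processed_pb_per_year:.2f} PB/year")
    print(f"Monte Carlo:    {MONTE_CARLO_PB_PER_YEAR:.2f} PB/year")
    print(f"Total growth:   {total_pb_per_year:.2f} PB/year")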

A year ago, Merino spent one month at the project site and got to know the data center. “Since IceCube was completed, the experiment has been making great findings, including the discovery of astrophysical neutrinos,” adds Merino. “My previous experience was in accelerator-based high-energy physics. I’m excited to work on the computing for IceCube and be part of all this amazing science.”

 


Images credit: Gonzalo Merino

The number of required simulations is very large. Merino says they can always use more Monte Carlo simulations. In this case, the universe is the particle accelerator. “The detector looks simple,” he says. “But we need lots of computing power to model the very complex physics processes that take place when IceCube detects cosmic particles. We have a lot in common with the Large Hadron Collider and other particle physics research in that respect. What we do is very similar to those large experiments except that ours is smaller, so we have fewer people and fewer resources. That is also a challenge. Being well connected to other collaborations is very important because we can learn from one another.”
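
Merino’s point about never having enough Monte Carlo can be illustrated with a toy example that is unrelated to IceCube’s actual simulation code: estimating a value by random sampling. The statistical error shrinks only as the square root of the number of samples, so each extra digit of precision costs roughly a hundred times more computing.

    # Toy Monte Carlo example (not IceCube's simulation code): estimate pi by
    # random sampling. The error shrinks only as 1/sqrt(N), which is why
    # simulation campaigns soak up so much computing.
    import math
    import random

    def estimate_pi(n_samples: int) -> float:
        """Throw points into the unit square and count hits inside the quarter circle."""
        hits = sum(1 for _ in range(n_samples)
                   if random.random() ** 2 + random.random() ** 2 <= 1.0)
        return 4.0 * hits / n_samples

    for n in (1_000, 100_000, 10_000_000):
        estimate = estimate_pi(n)
        print(f"N = {n:>10,}   pi ~ {estimate:.5f}   error = {abs(estimate - math.pi):.5f}")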

“We have great resources here at UW-Madison,” notes Merino. “Using HTCondor, we just have to submit a job that will try to run on UW resources first but then will automatically run on other Open Science Grid resources if more resources are needed. The nice part is that this all happens transparently for users. When we look at what resources we are using, we often see the jobs running elsewhere.”
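
As a minimal sketch of what that looks like from a user’s point of view, the snippet below submits a single job through the HTCondor Python bindings. The executable and file names are hypothetical placeholders, and the “run at UW first, overflow to the OSG” behavior Merino describes comes from the pool’s configuration rather than from anything in the job description itself.

    # Minimal sketch of submitting one job with the HTCondor Python bindings.
    # The executable and file names are hypothetical placeholders; whether the
    # job runs on local resources or overflows to the Open Science Grid is
    # decided by the pool's configuration, not by this description.
    import htcondor

    job = htcondor.Submit({
        "executable": "process_events.sh",   # hypothetical analysis script
        "arguments": "run_segment_042.i3",   # hypothetical input file
        "request_cpus": "1",
        "request_memory": "2GB",
        "output": "job.out",
        "error": "job.err",
        "log": "job.log",
    })

    schedd = htcondor.Schedd()     # scheduler on the local submit machine
    result = schedd.submit(job)    # queue one copy of the job
    print("Submitted job cluster", result.cluster())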

Merino is happy to belong to the Open Science Grid (OSG) and use the tools that have been developed there. He says that one thing they are doing in IceCube that could be brought to bear elsewhere is using graphics processing units (GPUs) alongside central processing units (CPUs). “For some workloads, GPUs are critically important,” he says. “We need good modeling of how millions of photons propagate through the ice in order to understand the data coming from the sensors.”

Tracking photons in the ice has always been one of the big computing challenges. The team had to work with very large files containing a parameterized model of the properties of the ice, and to get the full potential out of the detector, they needed better resolution in how they simulated the way light propagates through it. “GPUs turned out to be a perfect fit,” he says. “In the end, light propagation is pretty much what is done in video games, so we started using GPUs, and it worked. GPUs are doing a good job in photon simulation—up to 300 times faster than CPUs. Using GPUs has since become a core part of our process. We had to learn how to run a large cluster of them, a pretty steep learning curve in terms of system administration. We also have developed middleware that orchestrates all these workflows.” Part of the workload requires GPUs; other parts use CPUs. “Right now, all of the GPUs used for simulation belong to or are allocated to IceCube,” notes Merino. “We are looking for ways to use opportunistic GPU resources on the OSG.”
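
To see why photon propagation maps so naturally onto GPUs, consider the toy sketch below: many photons doing independent random walks through a scattering, absorbing medium. The scattering and absorption lengths are assumed numbers rather than IceCube’s measured ice model, and the code is a CPU illustration of the parallel structure, not the collaboration’s production software.

    # Toy photon-propagation sketch: many independent random walks through a
    # scattering, absorbing medium. The lengths below are assumed numbers, not
    # IceCube's ice model. Because every photon evolves independently, the
    # problem parallelizes trivially, which is what makes GPUs such a good fit.
    import numpy as np

    rng = np.random.default_rng(seed=1)

    n_photons = 1_000_000
    scatter_length_m = 25.0       # assumed mean distance between scatters
    absorption_length_m = 100.0   # assumed mean distance before absorption
    n_steps = 50

    positions = np.zeros((n_photons, 3))
    alive = np.ones(n_photons, dtype=bool)

    for _ in range(n_steps):
        # draw a step length and an isotropic direction for every photon at once
        steps = rng.exponential(scatter_length_m, size=n_photons)
        directions = rng.normal(size=(n_photons, 3))
        directions /= np.linalg.norm(directions, axis=1, keepdims=True)
        positions += alive[:, None] * steps[:, None] * directions
        # a photon survives each step with probability exp(-step / absorption_length)
        alive &= rng.exponential(absorption_length_m, size=n_photons) > steps

    print(f"Photons still propagating after {n_steps} steps: {alive.sum():,} of {n_photons:,}")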

“IceCube is looking at the universe with a different set of glasses and opening a new door that no one looked at before,” says Merino. “The IceCube telescope is a powerful tool to search for dark matter, and could reveal new physical processes at work in the origin of the universe.”

Last year, after sifting through their data, the project team discovered 28 neutrinos that quite likely came from sources outside our galaxy. (See Evidence for High-Energy Extraterrestrial Neutrinos at the IceCube Detector.) This year, the project reported further results, based on three years’ worth of data, in a new paper. You can follow the work of the South Pole team here.

~ Greg Moore