LBNL Team Takes Software to the End of the Earth

March 2, 2005

A string of 60 light detectors called “DOMs” (digital optical modules) is lowered deep into the Antarctic ice sheet. Photo by Mark Krasberg, IceCube, University of Wisconsin-Madison.

Developing robust, reliable data acquisition software for high-energy physics experiments is always a challenge, but developing such software for an experiment expected to run for up to 15 years while buried in the Antarctic ice poses unique problems.

But a team led by Chuck McParland of CRD’s Distributed Scientific Tools Group has risen to the occasion. The first kilometer-long string of 60 detectors recently buried near the South Pole is already recording light pulses as the experiment seeks out evidence of neutrinos.

The experiment, an international undertaking by 26 institutes, is known as IceCube because the strings of detectors will cover a cubic kilometer of ice. The detectors, which will number 4,800 once installation is completed in 4-5 years, will serve as a telescope designed to study the high-energy variety of the ghostlike subatomic particles known as neutrinos. Originating from the Milky Way and beyond and traveling to Earth virtually unobstructed, these high-energy neutrinos serve as windows back through time and should provide new insight into questions about the nature of dark matter, the origin of cosmic rays, and other cosmic issues.

Berkeley Lab researchers were responsible for the unique electronics package inside the digital optical modules (DOMs) that will enable IceCube to pick out the rare signal of a high-energy neutrino colliding with a molecule of water. A DOM is a pressurized glass sphere the size of a basketball that houses an optical sensor, called a photomultiplier tube, which can detect photons and convert them into electronic signals that scientists can analyze.

Equipped with onboard control, processing and communications hardware and software, and connected in long strings of 60 each via an electrical cable, the DOMs can detect neutrinos with energies ranging from 200 billion to one quadrillion or more electron volts. In January and February, the first IceCube cable, with its 60 DOMs, was lowered into a hole drilled through the Antarctic ice using jets of hot water. Plans call for a total of 80 strings of DOMs to be put in place over the next five years. The Antarctic summer season, during which the weather is “mild” enough for work on IceCube to proceed, lasts only from mid-October to mid-February. After that, winter sets in and the climate is much too harsh for any outdoor work.

While the operating environment offers a unique challenge, McParland said, “The biggest challenge is bridging the gap between the hardware side of the project and the software side.” The hardware developers needed some of the data acquisition software to test their systems, but that was just one aspect of the overall software project. “Getting the testing and the actual data acquisition components to all work together is a challenging job,” he said.

Adding to the difficulty was the push to engineer software that would support the 10- to 15-year lifespan of the experiment. “This means we had to put a lot more effort into the design of the software and use more modern programming practices than are typically used for other experiments of this sort,” McParland said. Most of the acquisition software is being designed by an LBNL-led team which includes Chris Day, Simon Patton, Akbar Mokhtarani, Artur Muratus, Keith Beattie, Martin Stoufer, Arthur Jones, David Hays and John Jacobsen of Berkeley Lab, as well as researchers at the University of Wisconsin-Madison, Penn State University and the University of Delaware.

The software is being developed in two pieces. The first piece goes “down the hole,” loaded on each of the DOMs. The system is designed so that the software in each module can be updated and, thanks to programmable logic, the circuit boards can also be reprogrammed to change the behavior of the detectors.

“Since the first string was installed we are already seeing events – cones of light that lit up all of the modules,” McParland said.

At the top of each detector string will be a PC, where the second piece of software will be installed. Together, the PCs will form a pyramid of processors, with each performing initial filtering of the data before sending them to a satellite for transmission to what will eventually be a 120-processor farm.
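
To make that two-tier data path concrete, here is a minimal sketch in Python. Everything in it is illustrative: the WaveformHit record, the hub_filter cut, and the merge_at_farm step are hypothetical names and placeholder logic, not the actual IceCube data acquisition code, which the article does not describe at this level.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical record for one time-stamped waveform. The field names,
# units, and structure are assumptions made for illustration, not the
# real IceCube DAQ format.
@dataclass
class WaveformHit:
    dom_id: int           # which optical module recorded the pulse
    time_ns: float        # calibrated timestamp, in nanoseconds
    samples: List[float]  # digitized waveform amplitudes

def hub_filter(hits: List[WaveformHit], threshold: float = 0.5) -> List[WaveformHit]:
    """First-pass reduction at the PC atop one string: forward only
    waveforms whose peak amplitude clears a threshold. The cut itself is
    an invented placeholder for whatever filtering the real string
    processors perform."""
    return [h for h in hits if h.samples and max(h.samples) >= threshold]

def merge_at_farm(hits_by_string: Dict[int, List[WaveformHit]]) -> List[WaveformHit]:
    """Stand-in for the central processor farm: apply the hub-level cut
    to each string's stream, then combine everything into one
    time-ordered list."""
    merged = [h for hits in hits_by_string.values() for h in hub_filter(hits)]
    return sorted(merged, key=lambda h: h.time_ns)
```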

The data flow consists of waveforms describing each event. Each waveform is time-stamped, which allows them to be compared, with accuracy to within five nanoseconds, so that researchers can see how and when a particle passed near each detector. Calibrating the system to allow this comparison required the team to develop a new protocol to stop all communication between the modules – a task beyond the capability of off-the-shelf protocols, McParland said.
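
As a rough illustration of that kind of timestamp comparison, reusing the hypothetical WaveformHit record from the sketch above and not the team's actual calibration or matching code, one could pair up hits whose calibrated times agree to within a small window:

```python
from typing import List, Tuple

def coincident_pairs(hits: List["WaveformHit"], window_ns: float = 5.0) -> List[Tuple[int, int]]:
    """Return index pairs of hits whose calibrated timestamps agree to
    within window_ns. The 5 ns default echoes the accuracy quoted above,
    but the pairing logic itself is purely illustrative."""
    order = sorted(range(len(hits)), key=lambda i: hits[i].time_ns)
    pairs: List[Tuple[int, int]] = []
    for a in range(len(order)):
        for b in range(a + 1, len(order)):
            i, j = order[a], order[b]
            if hits[j].time_ns - hits[i].time_ns > window_ns:
                break  # hits are sorted by time, so later ones only drift farther apart
            pairs.append((i, j))
    return pairs
```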

Once the data have been collected and an interesting event found, scientists will actually trigger the experiment after the fact, McParland said. This is achieved by breaking apart the collected data and going back through it to find the waveforms depicting the event.
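
Pictured as a minimal sketch, again using the hypothetical WaveformHit record from above and an arbitrary window size that is not taken from the experiment, such an after-the-fact trigger amounts to going back through the stored hits and pulling out everything near the interesting time:

```python
def waveforms_near(hits, trigger_time_ns, half_window_ns=5000.0):
    """Retrospective 'trigger': given an interesting time identified
    offline, return every stored waveform whose timestamp falls within
    half_window_ns of it, in time order. Expects objects with a time_ns
    attribute, e.g. the WaveformHit sketch above; the window size is an
    illustrative assumption."""
    return sorted(
        (h for h in hits if abs(h.time_ns - trigger_time_ns) <= half_window_ns),
        key=lambda h: h.time_ns,
    )
```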

Another problem faced by the developers was dealing with impurities in the glass used in the spheres and photomultiplier tubes. As the impurities decay, they give off light pulses, which meant that the software had to filter out 50 to 100 times more incidental signals than actual scientific data. “There’s just lots of background stuff you have to get rid of,” McParland said.
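
One plausible way to picture that kind of cleanup, though not the team's actual filtering algorithm, which the article does not spell out, is to drop pulses that no other module saw at about the same time, since a genuine event lights up several DOMs while decay light from impurities tends to fire a single module in isolation:

```python
def reject_isolated_hits(hits, window_ns=1000.0):
    """Crude background cut: keep a hit only if at least one other
    module fired within window_ns of it; isolated single-module pulses,
    such as light from decaying impurities in the glass, are dropped.
    Both the coincidence requirement and the window value are assumptions
    made for illustration. Expects objects with time_ns and dom_id
    attributes, e.g. the WaveformHit sketch above."""
    kept = []
    for h in hits:
        if any(other.dom_id != h.dom_id
               and abs(other.time_ns - h.time_ns) <= window_ns
               for other in hits):
            kept.append(h)
    return kept
```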

IceCube will field about 20 times the detector power of its predecessor, another South Pole high-energy neutrino telescope called AMANDA (Antarctic Muon And Neutrino Detector Array). McParland was also part of the AMANDA project, and spent five weeks at the South Pole as part of that effort.

While his group’s role in IceCube will begin to wind down in 2006, for the time being the work is both very demanding and proceeding very well, McParland said.

“Overall it’s a very interesting computer science problem in that we have a distributed group of people working to put together a good software design with high quality code,” he noted.


About Berkeley Lab

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.