Edison Early Users Deliver Results

January 31, 2014

By Margie Wylie

Before any supercomputer is accepted at NERSC, scientists are invited to put the system through its paces during an “early science” phase. While the main aim of this period is to test the new system, many scientists are able to use the time to significantly advance their work. (Related story: "Edison Electrifies Scientific Computing.") Two Berkeley Lab computational researchers were among those who pushed the limits of the new system while making advances in their fields.

The Fate of Sequestered CO2 

David Trebotich is modeling the effects of sequestering carbon dioxide (CO2) deep underground. The aim is to better understand the physical and chemical interactions between CO2, rocks and the minute, saline-filled pores through which the gas migrates. This information will help scientists understand how much we can rely on geologic sequestration as a means of reducing greenhouse gas emissions, which cause climate change.

The fine detail of this simulation—which shows computed pH on calcite grains at 1 micron resolution—is necessary to better understand what happens when the greenhouse gas carbon dioxide is injected underground rather than being released into the atmosphere to exacerbate climate change. (Image: David Trebotich)

Unlike today’s large-scale models, which are unable to resolve microscale features, Trebotich models the physical and chemical reactions happening at resolutions of hundreds of nanometers to tens of microns. His simulations cover only a tiny area—a tube just a millimeter wide and not even a centimeter long—but in exquisitely fine detail. 
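
For a sense of what that resolution implies, here is a rough, purely illustrative estimate (not a description of Chombo-Crunch's actual gridding) of how many cells a uniform one-micron mesh would need to cover a millimeter-wide, centimeter-long tube:

```python
# Illustrative back-of-the-envelope only; the real Chombo-Crunch simulation
# uses its own discretization, not the uniform grid assumed here.
tube_width_m  = 1e-3   # roughly a millimeter across
tube_length_m = 1e-2   # not even a centimeter long
cell_size_m   = 1e-6   # 1 micron resolution, as in the image caption above

cells_across = tube_width_m / cell_size_m        # 1,000 cells
cells_along  = tube_length_m / cell_size_m       # 10,000 cells
total_cells  = cells_across ** 2 * cells_along   # ~1e10 cells for a square cross-section

print(f"~{total_cells:.0e} cells at uniform 1-micron resolution")
```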

“We’re definitely dealing with big data and extreme computing,” said Trebotich. “The code, Chombo-Crunch, generates datasets of one terabyte for just a single 100-microsecond time step, and we need to do 16 seconds of that to match a ‘flow-through’ experiment.” That experiment, carried out by the Energy Frontier Research Center for Nanoscale Control of Geologic CO2, captured effluent concentrations as a solution of dissolved carbon dioxide was injected through a tube of crushed calcite.
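
Those two figures imply a striking data volume. The sketch below simply multiplies them out (an illustration of scale only; it assumes, unrealistically, that every single time step is written to disk):

```python
# Scale estimate based on the quoted figures: 1 TB per 100-microsecond step,
# 16 seconds of simulated time. Assumes (unrealistically) that every step
# is written out; in practice output is saved far less often.
tb_per_step  = 1.0        # terabytes of data per time step
step_size_s  = 100e-6     # 100-microsecond time step
simulated_s  = 16.0       # 16 seconds needed to match the experiment

n_steps  = simulated_s / step_size_s    # 160,000 time steps
total_pb = n_steps * tb_per_step / 1e3  # ~160 petabytes if every step were kept

print(f"{n_steps:,.0f} steps, ~{total_pb:,.0f} PB if every step were saved")
```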

Edison’s high memory-per-node architecture means that more of each calculation (and the resulting temporary data) can be stored close to the processors working on it. An early user of Edison, Trebotich was invited to run his codes to help test the machine. His simulations are running 2.5 times faster than on Hopper, NERSC’s previous flagship system, cutting the time it takes him to get a solution from months to just weeks.

“The memory bandwidth makes a huge difference in performance for our code,” Trebotich said. In fact, he expects the code would actually run slower on supercomputer architectures that have many more processors but less memory per processing core. Trebotich also noted that he needed only minimal changes to move the code from Hopper to Edison.

The aim is to eventually merge such finely detailed modeling results with large-scale simulations for more accurate models. Trebotich is also working on extending his simulations to shale gas extraction using hydraulic fracturing. The code framework could also be used for other energy applications, such as electrochemical transport in batteries.

Our Universe: The Big Picture 

Zarija Lukic works at the other end of the scale. A cosmologist with Berkeley Lab’s Computational Cosmology Center (C3), Lukic models mega-parsecs of space in an attempt to understand the large-scale structure of the universe.

Using the Nyx code on Edison, scientists were able to run the largest simulation of its kind (370 million light years on each side) showing neutral hydrogen in the large-scale structure of the universe. The blue webbing in the image represents gas responsible for the Lyman-alpha forest signal. The yellow areas are regions of higher density, where galaxy formation takes place. (Image: Casey Stark)

Since we can’t step outside our own universe to see its structure, cosmologists like Lukic examine the light from distant quasars and other bright light sources for clues.

When light from these distant quasars passes through clouds of hydrogen, the gas leaves a distinctive pattern in the light’s spectrum. By studying these patterns (clusters of absorption squiggles known as the Lyman-alpha forest), scientists can better understand what lies between us and the light source, revealing the process of structure formation in the universe.
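
To make that concrete, here is a toy sketch (not the Nyx calculation, and with entirely made-up cloud positions and strengths) of how hydrogen along a sightline stamps a forest of absorption dips onto a quasar spectrum: the transmitted flux is exp(-tau), and the optical depth tau rises wherever the line of sight crosses neutral hydrogen.

```python
import numpy as np

# Toy illustration of a Lyman-alpha forest, not the Nyx calculation:
# transmitted flux F = exp(-tau), with the optical depth tau rising
# wherever the sightline crosses a neutral hydrogen "cloud".
rng = np.random.default_rng(0)

pixels = np.arange(2000)                      # pixels along the quasar spectrum
tau = np.zeros_like(pixels, dtype=float)

# Scatter a handful of hypothetical hydrogen clouds along the sightline.
for center in rng.uniform(0, 2000, size=40):
    strength = rng.uniform(0.1, 2.0)          # arbitrary cloud strength
    width = rng.uniform(2, 8)                 # arbitrary line width (pixels)
    tau += strength * np.exp(-0.5 * ((pixels - center) / width) ** 2)

flux = np.exp(-tau)                           # each cloud leaves an absorption dip
print(f"mean transmitted flux: {flux.mean():.2f}")
```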

Using the Nyx code developed by Berkeley Lab’s Center for Computational Sciences and Engineering, Lukic and colleagues are creating virtual “what-if” universes to help cosmologists fill in the blanks in their observations.

Researchers use a variety of possible cosmological models (something like universal recipes), calculating for each the interplay between dark energy, dark matter, and the baryons that flow into gravity wells to become stars and galaxies. Cosmologists can then compare these virtual universes with real observations.
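
In outline, that comparison is a parameter scan: each candidate recipe predicts a summary statistic, which is then scored against the measured one. The sketch below is a deliberately simplified, hypothetical stand-in (the model names, numbers, and chi-square scoring are placeholders) for the much richer statistics actually used.

```python
import numpy as np

# Hedged sketch of comparing "what-if" universes to data: each candidate
# model predicts a summary statistic (here a made-up 1-D array standing in
# for something like a flux power spectrum) that is scored against observations.
def chi_square(predicted, observed, sigma):
    return np.sum(((predicted - observed) / sigma) ** 2)

observed = np.array([1.00, 0.82, 0.65, 0.51, 0.40])    # placeholder measurements
sigma    = np.full_like(observed, 0.05)                 # placeholder uncertainties

candidate_models = {                                     # hypothetical "recipes"
    "model_A": np.array([1.02, 0.80, 0.66, 0.50, 0.41]),
    "model_B": np.array([0.90, 0.75, 0.60, 0.48, 0.35]),
}

best = min(candidate_models,
           key=lambda name: chi_square(candidate_models[name], observed, sigma))
print("best-fitting recipe:", best)
```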

“The ultimate goal is to find a single physical model that fits not just Lyman-alpha forest observations, but all the data we have about the nature of matter and the universe, from cosmic microwave background measurements to results from experiments like the Large Hadron Collider,” Lukic said.

Working with 2 million hours of early access time on Edison, Lukic and collaborators performed one of the largest Lyman-alpha forest simulations to date: the equivalent of a cube measuring 370 million light years on each side. “With Nyx on Edison we’re able—for the first time—to get to volumes of space large enough and with resolution fine enough to create models that aren’t thrown off by numerical biases,” he said. Lukic and his collaborators are preparing their results for publication; he expects the work to become “the gold standard for Lyman-alpha forest simulations.”

Large-scale simulations such as these will be key to interpreting the data from many upcoming observational missions, including the Dark Energy Spectroscopic Instrument (DESI). The work supports the Dark Universe project, part of the Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC) program.


About Berkeley Lab

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.