Science Journal Features Research On Petascale Enabling Technologies
CRD-led SciDAC Projects Highlighted
December 5, 2007
The November 2007 issue of the Cyberinfrastructure Technology Watch (CTWatch) Quarterly included significant contributions – five out of the nine articles – from CRD researchers, who authored papers on code performance, software tools, visualization, scientific data management and data placement solutions for distributed petascale science.
The issue featured articles written by dozens of researchers about SciDAC’s Centers for Enabling Technologies. SciDAC (Scientific Discovery through Advanced Computing), run by the DOE Office of Science, funds projects that investigate the obstacles to petascale computing and develop hardware and software tools to overcome them.
The quarterly is published by CTWatch, an online venue for the science and engineering community focused on cyberinfrastructure technology. CTWatch is supported by the National Science Foundation’s Cyberinfrastructure Partnership between the San Diego Supercomputer Center (SDSC) and the National Center for Supercomputing Applications (NCSA). SDSC and NCSA run the forum along with the Innovative Computing Laboratory at the University of Tennessee.
Nine CRD researchers contributed to the current CTWatch Quarterly. David Bailey, chief technologist in CRD, was lead author of the article “Performance Engineering: Understanding and Improving the Performance of Large-Scale Codes.” The article identified issues that hamper code performance and described three approaches for helping scientists improve their codes and get more out of supercomputers: performance modeling of applications and systems, automatic performance tuning, and application engagement and tuning.
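As a rough illustration of what automatic performance tuning means in practice, the sketch below times a toy blocked kernel with several candidate block sizes and keeps the fastest. It is a minimal Python example of the search-over-configurations idea, not the institute’s actual tools, and the kernel and block sizes are purely illustrative.

# Toy illustration of automatic performance tuning: try several candidate
# blocking factors for the same kernel and keep the fastest. Not PERI tooling.
import time
import numpy as np

def blocked_matmul(a, b, block):
    """Naive blocked matrix multiply; the block size is the tuning knob."""
    n = a.shape[0]
    c = np.zeros((n, n))
    for i in range(0, n, block):
        for j in range(0, n, block):
            for k in range(0, n, block):
                c[i:i+block, j:j+block] += (
                    a[i:i+block, k:k+block] @ b[k:k+block, j:j+block]
                )
    return c

n = 512
a, b = np.random.rand(n, n), np.random.rand(n, n)

timings = {}
for block in (32, 64, 128, 256):   # candidate configurations to search
    start = time.perf_counter()
    blocked_matmul(a, b, block)
    timings[block] = time.perf_counter() - start

best = min(timings, key=timings.get)
print(f"best block size: {best} ({timings[best]:.3f} s)")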
Bailey and his fellow researchers in SciDAC’s Performance Engineering Research Institute noted that researchers often devote more programming effort to analyzing results than to improving code performance. Yet improving code performance could lead to huge savings. “A quick calculation shows that if one can increase by just 30 percent the performance of two of the major SciDAC applications codes (which together use, say, 10 percent of the NERSC and ORNL high-end systems over three years), this represents a savings of some $6 million,” the article said.
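To make that back-of-the-envelope estimate concrete, here is a minimal sketch of the arithmetic in Python. The 30 percent speedup and 10 percent usage share come from the quote; the combined three-year system value is an illustrative assumption chosen to reproduce a figure of roughly $6 million, since the article’s exact accounting is not spelled out here.

def machine_time_saved(speedup_fraction):
    """Fraction of machine time freed when the same workload runs
    (1 + speedup_fraction) times faster."""
    return 1.0 - 1.0 / (1.0 + speedup_fraction)

# Illustrative assumption (not from the article): combined value of the
# NERSC and ORNL high-end systems over three years.
three_year_system_value = 260e6

share_used_by_two_codes = 0.10   # "say, 10 percent" of those systems (from the quote)
speedup = 0.30                   # 30 percent performance increase (from the quote)

savings = (three_year_system_value
           * share_used_by_two_codes
           * machine_time_saved(speedup))

print(f"Estimated savings: ${savings / 1e6:.1f} million")   # about $6 million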
Kathy Yelick, head of CRD’s Future Technologies Group and incoming director of the National Energy Research Scientific Computing Center, and Dan Gunter of the Cyber Infrastructure Development group were among the article’s co-authors.
Yelick also contributed to the article “Creating Software Tools and Libraries for Leadership Computing.”
Wes Bethel, head of CRD’s Visualization group, authored the piece “DOE’s SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) – Strategies for Petascale Visual Data Analysis Success.” The article outlined the role and techniques of visualization in transforming large sets of observed or simulated data into images that help explain scientific discoveries. Bethel, who is co-PI of the VACET project, gave examples of research areas that have benefited from visualization tools, including astrophysics, combustion and particle physics.
“As a Center for Enabling Technology, VACET’s mission is the creation of usable, production-quality visualization and knowledge discovery software infrastructure that runs on large, parallel computer systems at DOE’s Open Computing facilities, and that provides solutions to challenging visual data exploration and knowledge discovery needs of modern science, particularly the DOE science community,” the article stated.
Several researchers from Bethel’s group contributed to the article: Cecilia Aragon, Prabhat and Gunther Weber.
Arie Shoshani, head of SciDAC’s Scientific Data Management Center, was lead author of the article “Scientific Data Management: Essential Technology for Accelerating Scientific Discoveries.” The piece discussed the software tools the center has developed for managing and analyzing large quantities of data, from organizing files to returning search results quickly.
One of the search tools highlighted by Shoshani was FastBit, which uses compressed bitmap indexes to speed up searches over large datasets. “FastBit is 12 times faster than any known compressed bitmap index in answering range queries,” Shoshani wrote. “Because of its speed, FastBit facilitates real-time analysis of data, searching over billions of data values in seconds.”
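For readers unfamiliar with bitmap indexing, the toy sketch below shows the basic idea behind answering a range query with precomputed bit vectors: one bit vector per value bin, combined with a bitwise OR at query time. It is only a conceptual Python/NumPy illustration under assumed bin edges and data; FastBit’s actual compressed index, and the speedup Shoshani cites, are not reproduced here.

# Toy sketch of the bitmap-index idea: precompute one bit vector per value
# bin, then answer a range query by OR-ing the relevant bit vectors instead
# of scanning the raw data. FastBit's own compression scheme is not shown.
import numpy as np

def build_bitmap_index(values, bin_edges):
    """One boolean bit vector per bin: bit i is set if values[i] falls in that bin."""
    bins = np.digitize(values, bin_edges)
    return {b: bins == b for b in range(len(bin_edges) + 1)}

def range_query(index, bin_ids):
    """Return the row numbers whose value lies in any requested bin (bitwise OR)."""
    hits = np.zeros_like(next(iter(index.values())))
    for b in bin_ids:
        hits |= index[b]
    return np.flatnonzero(hits)

values = np.random.uniform(0, 100, size=1_000_000)   # e.g. one column of simulation output
index = build_bitmap_index(values, bin_edges=[25, 50, 75])

# With these edges, "value >= 50" corresponds to bins 2 and 3.
rows = range_query(index, bin_ids=[2, 3])
print(f"{rows.size} rows match the range query")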
Finally, Dan Gunter and Brian Tierney were co-authors of the article “End-to-End Data Solutions for Distributed Petascale Science.” The article described work by SciDAC’s Center for Enabling Distributed Petascale Science, which focuses on developing software tools for quickly transferring and retrieving data, as well as for monitoring and troubleshooting problems during the process.
About Berkeley Lab
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.