CRD’s John Shalf Discusses Big Data with White House Staff
March 31, 2014
John Shalf, head of CRD’s Computer and Data Sciences Department, was one of four national laboratory representatives to meet with President Obama’s senior advisor and members of the Office of Science and Technology Policy (OSTP), the Office of Management and Budget (OMB), and other federal agencies to discuss the intersection of big data architecture requirements and exascale challenges. The March 25 meeting was held in the Eisenhower Executive Office Building, which sits next to the White House and houses most of the White House staff.
Among the points Shalf made were:
- Both experimental data and models are essential to the scientific discovery process; finding correlations in experimental data alone is insufficient. Computational models work together with experimental data to enable exploration of the underlying mechanisms that create correlations in big data, giving confidence that the findings are sound enough to support policy decisions.
- Since models and data are critical, we also need supercomputers that are good at tackling both of these areas. This is as true for “big science” experimental data in the Office of Science as it is for graph analytics for the intelligence community and data mining in business analytics.
- Because of technology changes, data movement and data storage have become a huge design challenge, one that brings cloud computing (Big Data), HPC, and DOE together.
- DOE has a long history of deep involvement in machine design and collaboration with the hardware industry to meet its mission needs in HPC.
- DOE is a trusted partner with the computing industry because it is neither a regulator nor a competitor. Industry benefits from the collaboration because it enhances its competitiveness.
- Ensuring the availability of future competitive computing systems for Big Data requires the co-design process that DOE is promoting in its exascale program, and DOE is the best and most open place to carry that process out. The transformation of computing will impact both cloud computing and HPC.
Shalf, who also serves as NERSC's Chief Technologist, was invited to the meeting because of his role as co-leader of the Computer Architecture Lab (CAL), a project to develop energy-efficient and effective processor and memory architectures for DOE’s exascale program. Presenting with Shalf were Jim Brase, Deputy Associate Director for Data Science at Lawrence Livermore; Jim Ahrens, Data Science at Scale Team Leader at Los Alamos; and Bill Hart, Distinguished Member of Technical Staff at Sandia. They discussed the large-scale experimental facilities operated by DOE, the massive datasets those facilities generate, the data processing capabilities needed to filter and analyze them, and the importance of computational models for understanding the underlying mechanisms and making sense of what is observed. The speakers emphasized that computational models and experimental data are equally important to the scientific discovery process.
Among the officials attending the meeting were John Podesta, Senior Advisor to President Obama; Christopher M. Kirchoff, Office of the Chairman of the Joint Chiefs of Staff; Pat Falcone and Rob Leland of OSTP; David Edelman, Senior Advisor for Internet, Innovation and Privacy to the White House; Dimitri Kusnezov, Director of the Office of Science and Policy at NNSA; Kendall Burman and Justin Antonipillai, Department of Commerce; Jared E. Goodman, Department of State; and Marjory Blumenthal, Executive Director of the President’s Council of Advisors on Science and Technology.
About Berkeley Lab
Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.