“Minimalist Machine Learning” Algorithms

Berkeley Lab mathematicians have developed a new approach to machine learning aimed at experimental imaging data. Rather than relying on the tens or hundreds of thousands of images used by typical machine learning methods, this new approach “learns” much more quickly and requires far fewer images.
Delivering Efficient Parallel I/O with HDF5

In an interview with the Department of Energy's Exascale Computing Project (ECP) communications team, Berkeley Lab's Suren Byna and Quincey Koziol talk about delivering efficient parallel I/O with HDF5 on exascale computing systems.

CRD Researchers Help Make Blockchains More Robust

In recent years, researchers at Berkeley Lab, UC Davis, and the University of Stavanger in Norway have developed a new protocol, called BChain, that makes private blockchains even more robust. The researchers are also working with colleagues at Berkeley Lab and beyond to adapt this tool to support applications of strategic importance to the Department of Energy’s Office of Science.

Big Scientific Array Data Analysis with ArrayUDF

A novel scalable framework developed by CRD researchers is improving scientific productivity by allowing researchers to run user-defined custom analysis operations on large arrays of data with massively parallel supercomputers, while leaving complex data management and performance optimization tasks to the underlying system.