Integrated Data Frameworks Group

Ludovico Bianchi
Computer Systems Engineer
Ludovico is a software developer (Computer Systems Engineer 2) in the Integrated Data Frameworks group. His current activities include software development coordination, release engineering, and the development of testing and Continuous Integration/Continuous Delivery (CI/CD) infrastructure for the IDAES, CCSI2, WaterTAP, PARETO, and DISPATCHES projects, as well as the exploration and development of data pipelines, analysis techniques, and anomaly detection solutions for large-scale operational data analytics with the Operations Technology Group at NERSC.
Previously, he was part of the core development team on the Deduce and Science Capsule projects.
His scientific background is in computing and data analysis for experimental particle physics. He holds a B.Sc. in Physics and M.Sc. in Nuclear and Subnuclear Physics from the University of Rome "Tor Vergata".
His previous research in physics includes work on the CDF experiment at Fermilab, Chicago, and the future PANDA experiment at FAIR in Darmstadt, Germany, for which he developed parallel algorithms for particle tracking and data analysis at the Research Center Jülich, Germany.
In addition, he has been involved in data science initiatives in the area of drinking water affordability, in collaboration with the California State Water Boards and the Water Data Collective at UC Berkeley.

Journal Articles

Devarshi Ghoshal, Ludovico Bianchi, Abdelilah Essiari, Michael Beach, Drew Paine, Lavanya Ramakrishnan, "Science Capsule - Capturing the Data Life Cycle", Journal of Open Source Software, 2021, 6:2484, doi: 10.21105/joss.02484

Conference Papers

Devarshi Ghoshal, Ludovico Bianchi, Abdelilah Essiari, Drew Paine, Sarah Poon, Michael Beach, Alpha N'Diaye, Patrick Huck, Lavanya Ramakrishnan, "Science Capsule: Towards Sharing and Reproducibility of Scientific Workflows", 2021 IEEE Workshop on Workflows in Support of Large-Scale Science (WORKS), November 15, 2021, doi: 10.1109/WORKS54523.2021.00014

Workflows are increasingly processing large volumes of data from scientific instruments, experiments and sensors. These workflows often consist of complex data processing and analysis steps that might include a diverse ecosystem of tools and also often involve human-in-the-loop steps. Sharing and reproducing these workflows with collaborators and the larger community is critical but hard to do without the entire context of the workflow including user notes and execution environment. In this paper, we describe Science Capsule, which is a framework to capture, share, and reproduce scientific workflows. Science Capsule captures, manages and represents both computational and human elements of a workflow. It automatically captures and processes events associated with the execution and data life cycle of workflows, and lets users add other types and forms of scientific artifacts. Science Capsule also allows users to create `workflow snapshots' that keep track of the different versions of a workflow and their lineage, allowing scientists to incrementally share and extend workflows between users. Our results show that Science Capsule is capable of processing and organizing events in near real-time for high-throughput experimental and data analysis workflows without incurring any significant performance overheads.


Payton Linton, William Melodia, Alina Lazar, Deborah Agarwal, Ludovico Bianchi, Devarshi Ghoshal, Gilberto Pastorello, Lavanya Ramakrishnan, Kesheng Wu, "Understanding Data Similarity in Large-Scale Scientific Datasets", 2019 IEEE International Conference on Big Data (Big Data), 2019, pp. 4525--4531