I am a postdoctoral researcher in the Center for Computational Sciences and Engineering at Lawrence Berkeley National Laboratory. My research focuses on developing algorithms for optimization problems constrained by computationally expensive simulations that are subject to uncertainty.
The current focus of my work is solving parameter inference problems when forward model evaluations are costly. I use Gaussian process models as computationally inexpensive surrogates of the forward model.
The novelty of my work lies in refining the Gaussian process surrogate adaptively and in a principled way, drawing on ideas from Bayesian optimization and exploiting the uncertainty estimates that the Gaussian process framework provides. The approach requires solving an auxiliary optimization problem that involves only evaluations of the surrogate model and therefore has negligible computational cost. The goal of this optimization step is to propose a new input parameter at which to evaluate the forward model; the resulting evaluation is then added to the training set. Proceeding iteratively, the algorithm obtains a "localized" Gaussian process surrogate that is accurate where it matters for the original inference problem, without requiring high accuracy of the surrogate globally. The resulting surrogate is used to generate samples from the posterior distribution of the parameters of interest at low cost, enabling further analysis and calibration tasks. For details of the method, see the pre-print in the Publications section or the presentation in the Talks section.
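To make the iterative idea concrete, here is a minimal sketch of an adaptive surrogate-refinement loop. It is illustrative only: the forward model, the candidate grid, and the acquisition rule (maximum predictive standard deviation) are stand-in assumptions, not the specific acquisition criterion developed in the work described above.

```python
# Illustrative sketch: adaptively refine a Gaussian process surrogate
# of an expensive forward model. The forward_model below is a cheap
# hypothetical stand-in; in practice each call would be an expensive
# simulation. The acquisition here (max predictive uncertainty) is a
# simple placeholder for a problem-specific criterion.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def forward_model(x):
    # Hypothetical "expensive" forward model.
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(0)
X_train = rng.uniform(-2, 2, size=(4, 1))          # small initial design
y_train = forward_model(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
X_cand = np.linspace(-2, 2, 200).reshape(-1, 1)    # cheap candidate inputs

for _ in range(5):                                 # adaptive refinement loop
    gp.fit(X_train, y_train)
    _, std = gp.predict(X_cand, return_std=True)   # surrogate uncertainty
    x_next = X_cand[np.argmax(std)]                # most uncertain candidate
    X_train = np.vstack([X_train, x_next])         # run forward model there
    y_train = np.append(y_train, forward_model(x_next))

gp.fit(X_train, y_train)                           # final localized surrogate
```

Each iteration costs only surrogate evaluations over the candidate set plus a single forward-model run, which is the key to keeping the overall budget small.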