Computational Mathematics and Scientific Computing Seminar

Computationally efficient MCMC using local surrogate models

Speaker: Andrew Davis, Courant, NYU

Location: Warren Weaver Hall 1302

Date: Sept. 17, 2021, 10 a.m.

Synopsis:

Bayesian inference problems often involve target distributions whose density functions are computationally expensive to evaluate. For example, many applications rely on physical models that require numerically solving a PDE. Sampling methods such as Markov chain Monte Carlo (MCMC) require many thousands of model evaluations, which is often computationally prohibitive. Replacing the target density with a surrogate model can significantly reduce the computational expense of MCMC sampling. However, a fixed surrogate model introduces a bias into the numerical method. Even more worrisome, small errors in the surrogate model can lead to very large errors in the expected values estimated from the MCMC samples. We therefore introduce a continually refined surrogate model that guarantees asymptotically exact sampling. Furthermore, we devise a new strategy for balancing the decay rate of the bias due to the surrogate with that of the MCMC variance. We prove that the error of the resulting local approximation MCMC (LA-MCMC) algorithm decays at roughly the expected Monte Carlo rate and demonstrate this rate numerically. Finally, we apply LA-MCMC to a computationally intensive Bayesian inverse problem arising in groundwater hydrology.
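
The abstract describes the method only at a high level; as a rough illustration of the general idea (not the speaker's algorithm), the sketch below runs a random-walk Metropolis sampler whose expensive log-density is replaced by a local quadratic fit through nearby stored evaluations, with new true-model evaluations added at a decaying rate so the surrogate keeps improving. The stand-in Gaussian density, the k-nearest-neighbor quadratic fit, and the t^(-1/2) refinement schedule are illustrative assumptions, not details taken from the talk.

```python
# Minimal sketch of MCMC with a locally refined surrogate log-density.
# The "expensive" model below is a cheap Gaussian stand-in; the refinement
# schedule and the local quadratic fit are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)

def true_log_density(x):
    """Expensive model stand-in: an anisotropic Gaussian log-density."""
    return -0.5 * (x[0] ** 2 + 4.0 * x[1] ** 2)

# Store of points where the true model has been evaluated.
pts, vals = [], []

def add_point(x):
    pts.append(np.asarray(x, dtype=float))
    vals.append(true_log_density(x))

def surrogate_log_density(x, k=10):
    """Local quadratic regression through the k nearest stored evaluations."""
    P = np.asarray(pts)
    d = np.linalg.norm(P - x, axis=1)
    idx = np.argsort(d)[:k]
    X = P[idx] - x                        # center the neighbors at the query point
    # Design matrix: [1, x1, x2, x1^2, x2^2, x1*x2]
    A = np.column_stack([np.ones(len(idx)), X, X ** 2, X[:, 0] * X[:, 1]])
    coef, *_ = np.linalg.lstsq(A, np.asarray(vals)[idx], rcond=None)
    return coef[0]                        # intercept = surrogate value at the query point

# Seed the surrogate with a few evaluations of the true model.
for _ in range(20):
    add_point(rng.normal(size=2))

x = np.zeros(2)
samples = []
for t in range(1, 5001):
    y = x + 0.5 * rng.normal(size=2)      # random-walk proposal
    # Refine with decaying probability so the surrogate is improved infinitely often
    # (a stand-in for the paper's refinement strategy).
    if rng.random() < t ** (-0.5):
        add_point(y)
    log_alpha = surrogate_log_density(y) - surrogate_log_density(x)
    if np.log(rng.random()) < log_alpha:  # Metropolis accept/reject on the surrogate
        x = y
    samples.append(x.copy())

samples = np.array(samples)
print("true-model evaluations:", len(pts))
print("sample mean:", samples.mean(axis=0))
```

The decaying-probability refinement here is only a placeholder: it mimics the requirement, stated in the abstract, that the surrogate be refined continually so that the sampling is asymptotically exact, while keeping the number of true-model evaluations far below the number of MCMC steps.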