
Adaptive Gaussian process surrogates for Bayesian inference

We present an adaptive approach to the construction of Gaussian process surrogates for Bayesian inference with expensive-to-evaluate forward models. Our method relies on the fully Bayesian approach to training Gaussian process models and uses the expected-improvement idea from Bayesian global optimization. We adaptively construct training designs by maximizing the expected improvement in the fit of the Gaussian process model to the noisy observational data. Numerical experiments on model problems with synthetic data demonstrate the effectiveness of the resulting adaptive designs compared to fixed, non-adaptive designs: they yield accurate posterior estimates at a fraction of the cost of inference with the forward models themselves.
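The sketch below illustrates the general pattern the abstract describes: a Gaussian process surrogate of the data misfit is refined adaptively, with each new design point chosen by an expected-improvement-style acquisition and evaluated with the expensive forward model. This is a minimal illustration, not the paper's method; the exact acquisition criterion, the fully Bayesian treatment of the GP hyperparameters, and all names (`toy_forward_model`, `misfit`, the loop sizes) are assumptions made for the example.

```python
# Hypothetical sketch: adaptive GP surrogate of an expensive misfit function,
# refined by an expected-improvement-style acquisition (minimization form).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def toy_forward_model(theta):
    """Stand-in for an expensive forward model G(theta)."""
    return np.sin(3.0 * theta) + 0.5 * theta

# Synthetic noisy observation at a "true" parameter value.
theta_true, sigma = 0.4, 0.1
y_obs = toy_forward_model(theta_true) + sigma * rng.normal()

def misfit(theta):
    """Negative log-likelihood (up to a constant); this is what the GP emulates."""
    return 0.5 * ((toy_forward_model(theta) - y_obs) / sigma) ** 2

def expected_improvement(mu, std, best):
    """Classic EI for minimizing the emulated misfit."""
    std = np.maximum(std, 1e-12)
    z = (best - mu) / std
    return (best - mu) * norm.cdf(z) + std * norm.pdf(z)

# Initial space-filling design on the prior support [-2, 2].
theta_design = rng.uniform(-2.0, 2.0, size=5)
f_design = np.array([misfit(t) for t in theta_design])

kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
candidates = np.linspace(-2.0, 2.0, 400)

for _ in range(15):  # adaptive refinement loop
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(theta_design.reshape(-1, 1), f_design)
    mu, std = gp.predict(candidates.reshape(-1, 1), return_std=True)
    ei = expected_improvement(mu, std, f_design.min())
    theta_new = candidates[np.argmax(ei)]               # largest expected improvement
    theta_design = np.append(theta_design, theta_new)
    f_design = np.append(f_design, misfit(theta_new))   # one more expensive model run

# The trained surrogate can now stand in for the forward model inside MCMC,
# e.g. by using exp(-gp.predict(theta)) as a cheap likelihood approximation.
print(f"design size: {theta_design.size}, best misfit: {f_design.min():.3f}")
```

The key design choice mirrored here is that expensive model evaluations are spent only where the surrogate expects the largest improvement, rather than on a fixed design chosen in advance.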
