Deep Ensemble as a Gaussian Process Posterior

29 Sep 2021  ·  Zhijie Deng, Feng Zhou, Jianfei Chen, Guoqiang Wu, Jun Zhu ·

Deep Ensemble (DE) is a flexible, practical, and effective alternative to Bayesian neural networks (BNNs) for uncertainty estimation in deep learning. However, DE is broadly criticized for lacking a proper Bayesian justification. Existing attempts to address this issue are typically coupled with a regression likelihood or rely on restrictive assumptions. In this work, we propose to define a Gaussian process (GP) approximate posterior with the ensemble members, based on which we perform variational inference directly in function space. We further develop a function-space posterior regularization mechanism to properly incorporate prior knowledge. We demonstrate the algorithmic benefits of variational inference in the GP family, and provide strategies to make training feasible. As a result, our method incurs only marginally higher training cost than the standard Deep Ensemble. Empirically, our approach achieves better uncertainty estimation than the existing Deep Ensemble and its variants across diverse scenarios.
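The core idea of viewing ensemble members as defining a Gaussian predictive over function values can be sketched minimally as follows. This is a simplified illustration, not the paper's full variational-inference method: the toy "members" below are hypothetical perturbed linear models standing in for independently trained neural networks, and the Gaussian is formed from the empirical mean and covariance of member outputs at a set of test inputs.

```python
import numpy as np

def make_member(seed):
    """Hypothetical ensemble member: a randomly perturbed linear model
    standing in for an independently trained deep network."""
    r = np.random.default_rng(seed)
    w, b = 1.0 + 0.1 * r.normal(), 0.1 * r.normal()
    return lambda x: w * x + b

members = [make_member(s) for s in range(5)]

# Evaluate all members at test inputs; stack into (n_members, n_points).
x_test = np.linspace(-2.0, 2.0, 50)
f = np.stack([m(x_test) for m in members])

# GP-style Gaussian predictive from the ensemble: the empirical mean and
# covariance over members define a Gaussian over function values at x_test.
mean = f.mean(axis=0)
cov = np.cov(f, rowvar=False)          # (n_points, n_points) covariance
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # pointwise uncertainty
```

The `mean` and `std` arrays give a predictive mean and per-point uncertainty band, while the full `cov` matrix captures correlations between function values at different inputs, which is what distinguishes a function-space (GP) view from independent per-point Gaussians.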

