1 code implementation • 19 Aug 2020 • Alexander Immer, Maciej Korzepa, Matthias Bauer
The generalized Gauss-Newton (GGN) approximation is often used to make practical Bayesian deep learning approaches scalable by replacing a second-order derivative with a product of first-order derivatives.
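A minimal sketch of the idea (not the paper's implementation): for a squared-error loss, the GGN approximation keeps only the Jacobian product JᵀJ and drops the term involving second derivatives of the network outputs. The toy model `f` below is a hypothetical two-output function chosen for illustration.

```python
import numpy as np

def f(theta):
    # Toy nonlinear "network" with 2 outputs and 2 parameters (illustrative only).
    return np.array([np.sin(theta[0]) + theta[1],
                     theta[0] * theta[1]])

def jacobian(theta, eps=1e-6):
    # Central finite-difference Jacobian of f at theta.
    n_out, n_par = 2, 2
    J = np.zeros((n_out, n_par))
    for j in range(n_par):
        e = np.zeros(n_par)
        e[j] = eps
        J[:, j] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return J

theta = np.array([0.3, -0.7])
J = jacobian(theta)

# GGN for L(theta) = 0.5 * ||f(theta) - y||^2: only first-order
# derivatives enter; the exact Hessian would add residual-weighted
# second derivatives of f, which the GGN discards.
ggn = J.T @ J
print(ggn)
```

By construction JᵀJ is symmetric positive semi-definite, which is one reason the GGN is attractive as a curvature approximation in second-order and Laplace-style methods.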
1 code implementation • NeurIPS 2019 • Mohammad Emtiyaz Khan, Alexander Immer, Ehsan Abedi, Maciej Korzepa
Deep neural networks (DNNs) and Gaussian processes (GPs) are two powerful models with several theoretical connections relating them, but the relationship between their training methods is not well understood.