no code implementations • 22 May 2023 • Katharina Ott, Michael Tiemann, Philipp Hennig
As a first contribution, we show that basic and lightweight Bayesian deep learning techniques like the Laplace approximation can be applied to neural ODEs to yield structured and meaningful uncertainty quantification.
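The flavor of a lightweight Laplace approximation can be sketched on a toy linear model standing in for a network's last layer: fit a MAP estimate, then take a Gaussian posterior whose covariance is the inverse Hessian of the negative log posterior. This is a minimal illustration, not the paper's method; all names and the data are made up, and for a linear-Gaussian model the "approximation" happens to be exact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise; a linear model stands in for a trained last layer.
X = rng.normal(size=(50, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=50)

prior_prec = 1.0     # isotropic Gaussian prior precision (assumed)
noise_prec = 100.0   # observation noise precision, 1 / 0.1**2

# Laplace posterior: N(w_map, H^-1), with H the Hessian of the
# negative log posterior at the MAP estimate.
H = noise_prec * X.T @ X + prior_prec * np.eye(1)
w_map = np.linalg.solve(H, noise_prec * X.T @ y)
cov = np.linalg.inv(H)

def pred_var(x):
    # Predictive variance grows with distance from the training data --
    # this is the "structured and meaningful" uncertainty in miniature.
    return float(x @ cov @ x) + 1.0 / noise_prec
```

The useful qualitative behavior is visible even here: inputs far from the data get a wider predictive interval, while the MAP weights themselves are untouched.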
1 code implementation • 22 May 2023 • Katharina Ott, Michael Tiemann, Philipp Hennig, François-Xavier Briol
Bayesian probabilistic numerical methods for numerical integration offer significant advantages over their non-Bayesian counterparts: they can encode prior information about the integrand, and can quantify uncertainty over estimates of an integral.
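Both advantages can be seen in a minimal Bayesian quadrature sketch: place a GP prior on the integrand, condition on a few function evaluations, and read off a Gaussian posterior over the integral. This is an illustrative toy, not the paper's construction; the kernel, node placement, and grid-based kernel embeddings are all assumptions (closed forms exist for common kernels).

```python
import numpy as np

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel encoding prior smoothness of the integrand.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

f = lambda x: np.sin(2 * np.pi * x) + 1.0  # integrand; true integral on [0, 1] is 1.0

# Evaluate f at a few nodes; the GP prior induces a Gaussian posterior on the integral.
nodes = np.linspace(0.05, 0.95, 8)
K = rbf(nodes, nodes) + 1e-8 * np.eye(8)

# Kernel mean embedding z_i = integral of k(x, x_i) over [0, 1],
# here approximated on a fine grid for simplicity.
grid = np.linspace(0.0, 1.0, 2001)
z = rbf(grid, nodes).mean(axis=0)
zz = rbf(grid, grid).mean()  # double integral of the kernel

bq_mean = z @ np.linalg.solve(K, f(nodes))       # posterior mean of the integral
bq_var = zz - z @ np.linalg.solve(K, z)          # posterior variance = uncertainty
```

The posterior mean plays the role of the quadrature estimate, and `bq_var` shrinks as nodes are added: exactly the uncertainty quantification a non-Bayesian rule does not provide.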
no code implementations • 27 Feb 2023 • Katharina Ensinger, Sebastian Ziesche, Barbara Rakitsch, Michael Tiemann, Sebastian Trimpe
This filtering technique combines two signals by applying a high-pass filter to one and a low-pass filter to the other.
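A classic discrete-time instance of this idea is the complementary filter: blend a drifting increment signal (implicitly high-passed) with a noisy absolute signal (implicitly low-passed) through a single mixing coefficient. The function below is a generic sketch under assumed synthetic data, not the paper's specific filter.

```python
import numpy as np

def complementary_filter(increments, noisy, alpha=0.98):
    """Sketch: blend a drifting increment signal (high-passed) with a
    noisy absolute signal (low-passed) via one mixing coefficient."""
    out = np.empty_like(noisy)
    est = noisy[0]
    for i in range(len(noisy)):
        est = alpha * (est + increments[i]) + (1 - alpha) * noisy[i]
        out[i] = est
    return out

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
true = np.sin(t)
noisy = true + 0.3 * rng.normal(size=t.size)          # absolute signal + noise
increments = np.diff(true, prepend=true[0]) + 0.001   # increments with drift bias

fused = complementary_filter(increments, noisy)
```

The increments give smooth short-term tracking while the noisy absolute signal corrects their accumulating drift, so the fused estimate beats either input alone.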
no code implementations • 2 Feb 2021 • Katharina Ensinger, Friedrich Solowjow, Sebastian Ziesche, Michael Tiemann, Sebastian Trimpe
On the other hand, classical numerical integrators are specifically designed to preserve these crucial properties through time.
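The property-preserving behavior of classical integrators is easy to demonstrate on a harmonic oscillator with Hamiltonian H = (q² + p²)/2: explicit Euler inflates the energy exponentially, while symplectic Euler, which only reorders the two updates, keeps it bounded. This is a standard textbook contrast, shown here as a self-contained sketch.

```python
def explicit_euler(q, p, dt, n):
    # Both coordinates updated from the old state -> energy grows each step.
    for _ in range(n):
        q, p = q + dt * p, p - dt * q
    return q, p

def symplectic_euler(q, p, dt, n):
    # Momentum updated first, position updated with the *new* momentum ->
    # a nearby "shadow" Hamiltonian is conserved, so energy stays bounded.
    for _ in range(n):
        p = p - dt * q
        q = q + dt * p
    return q, p

energy = lambda q, p: 0.5 * (q * q + p * p)
e0 = energy(1.0, 0.0)
qe, pe = explicit_euler(1.0, 0.0, 0.05, 2000)
qs, ps = symplectic_euler(1.0, 0.0, 0.05, 2000)
```

After 2000 steps the explicit scheme's energy has blown up by orders of magnitude, while the symplectic scheme stays within a few percent of e0 — the kind of structural guarantee a generic learned model lacks.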
no code implementations • ICLR 2021 • Katharina Ott, Prateek Katiyar, Philipp Hennig, Michael Tiemann
If the trained model is supposed to be a flow generated from an ODE, it should be possible to choose another numerical solver with equal or smaller numerical error without loss of performance.
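That diagnostic can be sketched directly: integrate a fixed vector field (standing in for a trained neural ODE) with a coarse solver and with a higher-order solver on the same grid, and compare both to a near-exact reference. The vector field and tolerances here are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def f(t, x):
    # Stand-in for a trained neural ODE vector field.
    return -x + np.sin(t)

def integrate(f, x0, t0, t1, n, rk4=True):
    x, t = x0, t0
    h = (t1 - t0) / n
    for _ in range(n):
        if rk4:  # classical 4th-order Runge-Kutta step
            k1 = f(t, x)
            k2 = f(t + h / 2, x + h / 2 * k1)
            k3 = f(t + h / 2, x + h / 2 * k2)
            k4 = f(t + h, x + h * k3)
            x = x + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        else:    # explicit Euler step
            x = x + h * f(t, x)
        t += h
    return x

# If the model is a genuine ODE flow, swapping in a solver with smaller
# numerical error should barely change the output.
coarse = integrate(f, 1.0, 0.0, 5.0, 50, rk4=False)   # Euler
fine = integrate(f, 1.0, 0.0, 5.0, 50, rk4=True)      # RK4, same step count
ref = integrate(f, 1.0, 0.0, 5.0, 5000, rk4=True)     # near-exact reference
```

For a true ODE flow the RK4 output sits on top of the reference while Euler visibly deviates; a model that instead depends on the training-time solver would break this pattern.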
no code implementations • ICML 2020 • Hans Kersting, Nicholas Krämer, Martin Schiegg, Christian Daniel, Michael Tiemann, Philipp Hennig
To address this shortcoming, we employ Gaussian ODE filtering (a probabilistic numerical method for ODEs) to construct a local Gaussian approximation to the likelihood.
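A minimal Gaussian ODE filter can be written as a Kalman filter: put a once-integrated Wiener process prior on the state and its derivative, and at each step "observe" that the derivative matches the vector field (the zeroth-order linearization, often called EKF0). This is a bare-bones sketch of the general technique under assumed prior parameters, not the paper's full likelihood construction.

```python
import numpy as np

def gaussian_ode_filter(f, x0, t1, n, q=1.0):
    """EKF0-style sketch: Kalman filtering with an integrated Wiener
    process prior on (x, x'), conditioning on x' - f(x) = 0."""
    h = t1 / n
    A = np.array([[1.0, h], [0.0, 1.0]])                # prior transition
    Q = q * np.array([[h ** 3 / 3, h ** 2 / 2],
                      [h ** 2 / 2, h]])                 # process noise
    Hm = np.array([[0.0, 1.0]])                         # measures the derivative
    m = np.array([x0, f(x0)])
    P = np.zeros((2, 2))
    for _ in range(n):
        m, P = A @ m, A @ P @ A.T + Q                   # predict
        z = f(m[0]) - m[1]                              # residual: x' - f(x) ~ 0
        S = float(Hm @ P @ Hm.T)
        K = (P @ Hm.T / S).ravel()
        m, P = m + K * z, P - np.outer(K, K) * S        # Kalman update
    return m[0], P[0, 0]                                # mean and variance of x(t1)

# Solve x' = -x, x(0) = 1 on [0, 1]; the exact answer is exp(-1).
est_mean, est_var = gaussian_ode_filter(lambda x: -x, 1.0, 1.0, 200)
```

The output is not just a point estimate but a local Gaussian — exactly the object that can then serve as an approximation to the likelihood.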