1 code implementation • 16 Mar 2024 • Aidan Scannell, Riccardo Mereu, Paul Chang, Ella Tamir, Joni Pajarinen, Arno Solin
Our parameterization offers: (i) a way to scale function-space methods to large data sets via sparsification, (ii) retention of prior knowledge when access to past data is limited, and (iii) a mechanism to incorporate new data without retraining.
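Point (iii) — incorporating new data without retraining — can be loosely illustrated with a Bayesian linear model in a fixed feature basis, where sufficient statistics are accumulated batch by batch. This is a minimal sketch under that simplifying assumption, not the paper's sparse function-space parameterization; the class and method names here are hypothetical.

```python
import numpy as np

class FunctionSpacePosterior:
    """Illustrative stand-in: Bayesian linear regression over fixed features.
    New batches update sufficient statistics, so old data never needs to be
    revisited -- a crude analogue of point (iii), not the paper's method."""

    def __init__(self, n_features, noise_var=0.1, prior_prec=1.0):
        self.noise_var = noise_var
        self.Lambda = prior_prec * np.eye(n_features)  # posterior precision
        self.b = np.zeros(n_features)                  # Phi^T y / noise_var

    def update(self, Phi, y):
        # Accumulate sufficient statistics from the new batch only.
        self.Lambda += Phi.T @ Phi / self.noise_var
        self.b += Phi.T @ y / self.noise_var

    def predict(self, Phi):
        # Posterior mean/variance of f = Phi @ w under the Gaussian posterior.
        mean_w = np.linalg.solve(self.Lambda, self.b)
        cov_w = np.linalg.inv(self.Lambda)
        mean = Phi @ mean_w
        var = np.einsum("nd,dk,nk->n", Phi, cov_w, Phi) + self.noise_var
        return mean, var
```

Because the update is additive in the sufficient statistics, processing the data in two batches yields exactly the same posterior as processing it in one.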
2 code implementations • 5 Sep 2023 • Aidan Scannell, Riccardo Mereu, Paul Chang, Ella Tamir, Joni Pajarinen, Arno Solin
Deep neural networks (NNs) are known to lack uncertainty estimates and struggle to incorporate new data.
1 code implementation • 31 Jan 2023 • Ella Tamir, Martin Trapp, Arno Solin
We integrate Bayesian filtering and optimal control into learning the diffusion process, enabling the generation of constrained stochastic processes governed by sparse observations at intermediate stages and by terminal constraints.
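The simplest example of a diffusion with a terminal constraint is a Brownian bridge, which can be sampled in closed form. This sketch is only a toy special case for intuition, not the paper's learned, control-based construction; the function name is hypothetical.

```python
import numpy as np

def brownian_bridge(x0, x1, t0, t1, n_steps, rng):
    """Sample a Brownian bridge pinned at (t0, x0) and (t1, x1):
    a diffusion conditioned on a terminal constraint."""
    ts = np.linspace(t0, t1, n_steps + 1)
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    for i in range(n_steps):
        t, t_next = ts[i], ts[i + 1]
        # Conditional mean and variance of the bridge at t_next given xs[i].
        mean = xs[i] + (x1 - xs[i]) * (t_next - t) / (t1 - t)
        var = (t_next - t) * (t1 - t_next) / (t1 - t)
        xs[i + 1] = rng.normal(mean, np.sqrt(var))
    return ts, xs
```

Every sampled path starts at `x0` and ends exactly at `x1`; the conditioning pulls the process toward the terminal value as `t` approaches `t1`.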
1 code implementation • NeurIPS 2021 • Arno Solin, Ella Tamir, Prakhar Verma
Simulation-based techniques such as variants of stochastic Runge-Kutta are the de facto approach for inference with stochastic differential equations (SDEs) in machine learning.
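To make the simulation-based baseline concrete, here is an Euler-Maruyama sketch, the simplest member of the stochastic Runge-Kutta family: each step adds a deterministic drift increment and a Gaussian Wiener increment. This is a generic textbook scheme, not code from the paper.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t0, t1, n_steps, rng):
    """Simulate dX_t = drift(X_t, t) dt + diffusion(X_t, t) dW_t
    on [t0, t1] from X_{t0} = x0, returning the discretized path."""
    dt = (t1 - t0) / n_steps
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    t = t0
    for i in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))  # Wiener increment
        xs[i + 1] = xs[i] + drift(xs[i], t) * dt + diffusion(xs[i], t) * dW
        t += dt
    return xs

# Example: Ornstein-Uhlenbeck process dX = -theta * X dt + sigma dW.
rng = np.random.default_rng(0)
path = euler_maruyama(lambda x, t: -1.0 * x, lambda x, t: 0.5,
                      x0=1.0, t0=0.0, t1=1.0, n_steps=1000, rng=rng)
```

Inference then requires many such simulated paths per parameter setting, which is precisely the cost that motivates the simulation-free approaches discussed in the paper.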