no code implementations • 22 Feb 2021 • Filip de Roos, Carl Jidling, Adrian Wills, Thomas Schön, Philipp Hennig
Machine learning practitioners invest significant manual and computational resources in finding suitable learning rates for optimization algorithms.
no code implementations • 14 Dec 2020 • Jarrad Courts, Johannes Hendriks, Adrian Wills, Thomas Schön, Brett Ninness
In this work, a variational approach is used to provide an assumed density that approximates the desired but intractable distribution.
no code implementations • 8 Dec 2020 • Jarrad Courts, Adrian Wills, Thomas Schön, Brett Ninness
This paper considers parameter estimation for nonlinear state-space models, which is an important but challenging problem.
no code implementations • 13 Mar 2020 • Maria Bånkestad, Jens Sjölund, Jalil Taghia, Thomas Schön
We present the elliptical processes -- a family of non-parametric probabilistic models that subsumes the Gaussian process and the Student-t process.
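As a rough, hedged illustration (not code from the paper): a Student-t process draw can be obtained by rescaling a Gaussian process draw with a chi-squared mixing variable, which is the scale-mixture construction underlying the elliptical family. Kernel, lengthscale, and degrees of freedom below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x, lengthscale=1.0):
    # Squared-exponential kernel (illustrative choice).
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

x = np.linspace(0.0, 5.0, 50)
K = rbf_kernel(x) + 1e-6 * np.eye(len(x))  # jitter for numerical stability
L = np.linalg.cholesky(K)

# Gaussian process draw: f = L z with z ~ N(0, I).
z = rng.standard_normal(len(x))
f_gp = L @ z

# Student-t process draw (nu degrees of freedom): rescale the Gaussian
# draw by sqrt(nu / g) with g ~ chi^2(nu) -- one member of the
# elliptical family; nu -> infinity recovers the Gaussian process.
nu = 4.0
g = rng.chisquare(nu)
f_tp = f_gp * np.sqrt(nu / g)
```

The same recipe with a different distribution on the mixing variable gives other members of the elliptical family.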
1 code implementation • 5 Feb 2020 • Johannes Hendriks, Carl Jidling, Adrian Wills, Thomas Schön
We present a novel approach to modelling and learning vector fields from physical systems using neural networks that explicitly satisfy known linear operator constraints.
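A minimal sketch of the underlying idea, under assumptions not taken from the paper: a 2-D divergence-free field can be obtained by construction by differentiating a scalar potential, here a tiny random MLP standing in for a trained network, so the constraint holds for any parameter values. Derivatives are taken with central finite differences for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((16, 2)), rng.standard_normal(16)
w2 = rng.standard_normal(16)

def potential(x, y):
    # Scalar potential phi(x, y) from a tiny random MLP (illustrative).
    h = np.tanh(W1 @ np.array([x, y]) + b1)
    return w2 @ h

def field(x, y, eps=1e-5):
    # Divergence-free field by construction: f = (dphi/dy, -dphi/dx),
    # i.e. the rotated gradient (curl) of the scalar potential.
    dphi_dx = (potential(x + eps, y) - potential(x - eps, y)) / (2 * eps)
    dphi_dy = (potential(x, y + eps) - potential(x, y - eps)) / (2 * eps)
    return np.array([dphi_dy, -dphi_dx])

def divergence(x, y, eps=1e-4):
    # Numerical check: div f = dfx/dx + dfy/dy, which cancels because
    # the mixed partials of phi are equal.
    dfx_dx = (field(x + eps, y)[0] - field(x - eps, y)[0]) / (2 * eps)
    dfy_dy = (field(x, y + eps)[1] - field(x, y - eps)[1]) / (2 * eps)
    return dfx_dx + dfy_dy

div = divergence(0.3, -0.7)  # approximately zero, up to finite-difference error
```

The same principle extends to other linear operator constraints by designing the transformation from the latent potential accordingly.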
no code implementations • 25 Sep 2019 • Jalil Taghia, Maria Bånkestad, Fredrik Lindsten, Thomas Schön
Models that output a vector of responses given some inputs, in the form of a conditional mean vector, are at the core of machine learning.
no code implementations • 3 Sep 2019 • Adrian Wills, Thomas Schön
In this paper we present a novel quasi-Newton algorithm for use in stochastic optimisation.
no code implementations • 12 Feb 2018 • Adrian Wills, Thomas Schön
We provide a numerically robust and fast method capable of exploiting the local geometry when solving large-scale stochastic optimisation problems.
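For context, a hedged sketch of the classical quasi-Newton recursion that such stochastic methods build on, not the paper's algorithm: textbook BFGS with an exact line search on a 2-D quadratic, where the inverse-Hessian approximation captures the local geometry. Problem data are arbitrary.

```python
import numpy as np

# Quadratic test problem f(x) = 0.5 x^T A x - b^T x (illustrative).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
H = np.eye(2)                       # inverse-Hessian approximation
for _ in range(10):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = -H @ g                      # quasi-Newton search direction
    alpha = -(g @ p) / (p @ A @ p)  # exact step length for a quadratic
    x_new = x + alpha * p
    s, y = x_new - x, grad(x_new) - g
    rho = 1.0 / (s @ y)             # curvature s.y > 0 on a convex quadratic
    I = np.eye(2)
    # BFGS update of the inverse-Hessian approximation.
    H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
        + rho * np.outer(s, s)
    x = x_new

x_star = np.linalg.solve(A, b)      # closed-form minimiser for reference
```

With exact line searches, BFGS terminates on an n-dimensional quadratic in at most n iterations; the stochastic setting replaces exact gradients with noisy ones, which is where the robustness questions the papers address arise.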
no code implementations • NeurIPS 2012 • Fredrik Lindsten, Thomas Schön, Michael I. Jordan
We present a novel method in the family of particle MCMC methods that we refer to as particle Gibbs with ancestor sampling (PG-AS).
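As a hedged sketch of the building block, not PG-AS itself: a bootstrap particle filter for a 1-D linear-Gaussian state-space model. Particle Gibbs methods extend this with a conditioned reference trajectory, and ancestor sampling reassigns that trajectory's ancestry at each step. The model and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear-Gaussian state-space model (illustrative):
#   x_t = 0.9 x_{t-1} + v_t,  v_t ~ N(0, 0.25)
#   y_t = x_t + e_t,          e_t ~ N(0, 1)
a, q2, r = 0.9, 0.25, 1.0
T, N = 50, 500

# Simulate a state trajectory and observations.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + np.sqrt(q2) * rng.standard_normal()
y = x_true + np.sqrt(r) * rng.standard_normal(T)

# Bootstrap particle filter.
particles = rng.standard_normal(N)
means = []
for t in range(T):
    if t > 0:
        # Multinomial resampling of ancestors, then propagate dynamics.
        anc = rng.choice(N, size=N, p=w)
        particles = a * particles[anc] + np.sqrt(q2) * rng.standard_normal(N)
    # Reweight by the Gaussian observation likelihood.
    logw = -0.5 * (y[t] - particles) ** 2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()
    means.append(w @ particles)   # filtered posterior-mean estimate
```

In particle Gibbs, one particle path is fixed to the previous MCMC iterate; ancestor sampling mitigates the path degeneracy this conditioning causes.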