no code implementations • 17 Jun 2022 • Filip Tronarp, Toni Karvonen
We present a general Fourier analytic technique for constructing orthonormal basis expansions of translation-invariant kernels from orthonormal bases of $\mathscr{L}_2(\mathbb{R})$.
no code implementations • 17 Mar 2022 • Toni Karvonen, Chris J. Oates
Gaussian process regression underpins countless academic and industrial applications of machine learning and statistics, with maximum likelihood estimation routinely used to select appropriate parameters for the covariance kernel.
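A minimal sketch of this workflow, assuming a squared-exponential kernel and using a simple grid search in place of a gradient-based optimiser (the kernel, noise level, and toy data are illustrative, not the paper's setup):

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale):
    """Squared-exponential (RBF) covariance kernel."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def log_marginal_likelihood(x, y, lengthscale, noise=1e-4):
    """Gaussian process log marginal likelihood with an RBF kernel."""
    K = rbf_kernel(x, x, lengthscale) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2 * np.pi))

# Toy data from a smooth function.
x = np.linspace(0.0, 1.0, 20)
y = np.sin(2.0 * np.pi * x)

# Maximum likelihood estimation of the lengthscale by grid search.
grid = np.logspace(-2, 1, 50)
lml = np.array([log_marginal_likelihood(x, y, ell) for ell in grid])
ell_hat = grid[np.argmax(lml)]
print(f"maximum likelihood lengthscale: {ell_hat:.3f}")
```

In practice the log marginal likelihood is maximised jointly over all kernel parameters with a gradient-based optimiser; the grid search here just keeps the sketch short.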
no code implementations • 10 Mar 2022 • Toni Karvonen
We prove that the maximum likelihood estimate of the smoothness parameter cannot asymptotically undersmooth the truth when the data are obtained on a fixed bounded subset of $\mathbb{R}^d$.
1 code implementation • 3 Dec 2021 • Jonathan Wenger, Nicholas Krämer, Marvin Pförtner, Jonathan Schmidt, Nathanael Bosch, Nina Effenberger, Johannes Zenn, Alexandra Gessner, Toni Karvonen, François-Xavier Briol, Maren Mahsereci, Philipp Hennig
Probabilistic numerical methods (PNMs) solve numerical problems via probabilistic inference.
1 code implementation • NeurIPS 2021 • Onur Teymur, Christopher N. Foley, Philip G. Breen, Toni Karvonen, Chris J. Oates
One approach is to model the unknown quantity of interest as a random variable, and to constrain this variable using data generated during the course of a traditional numerical method.
no code implementations • 4 Mar 2021 • Toni Karvonen
It is known that whether the samples of a Gaussian process $X$ belong to a given reproducing kernel Hilbert space (RKHS) is controlled by a certain nuclear dominance condition.
no code implementations • 1 Feb 2021 • Toni Karvonen, Jon Cockayne, Filip Tronarp, Simo Särkkä
We study a class of Gaussian processes for which the posterior mean, for a particular choice of data, replicates a truncated Taylor expansion of any order.
1 code implementation • 31 Jan 2020 • Leah F. South, Toni Karvonen, Chris Nemeth, Mark Girolami, Chris J. Oates
The numerical approximation of posterior expected quantities of interest is considered.
Computation • Methodology
no code implementations • 29 Jan 2020 • Toni Karvonen, George Wynne, Filip Tronarp, Chris J. Oates, Simo Särkkä
We show that the maximum likelihood estimation of the scale parameter alone provides significant adaptation against misspecification of the Gaussian process model in the sense that the model can become "slowly" overconfident at worst, regardless of the difference between the smoothness of the data-generating function and that expected by the model.
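The scale (amplitude) parameter is special in that its maximum likelihood estimate is available in closed form given the remaining kernel parameters. A minimal sketch (the Gaussian kernel, lengthscale, and jitter are illustrative choices, not the paper's setting):

```python
import numpy as np

def scale_mle(x, y, lengthscale=0.2, jitter=1e-6):
    """Closed-form maximum likelihood estimate of the scale parameter
    sigma^2 in the kernel sigma^2 * k(x, x'), with k held fixed:
    sigma^2_hat = y^T K^{-1} y / n."""
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / lengthscale) ** 2) + jitter * np.eye(len(x))
    return float(y @ np.linalg.solve(K, y)) / len(x)

x = np.linspace(0.0, 1.0, 30)
s_smooth = scale_mle(x, np.sin(2.0 * np.pi * x))          # as smooth as the model expects
s_rough = scale_mle(x, np.sign(np.sin(10.0 * np.pi * x)))  # much rougher than the model expects
print(s_smooth, s_rough)
```

The rougher function yields a much larger scale estimate, which inflates the posterior credible intervals; this is the mechanism by which scale estimation guards against overconfidence.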
2 code implementations • 8 Jan 2020 • Zheng Zhao, Toni Karvonen, Roland Hostettler, Simo Särkkä
The paper is concerned with non-linear Gaussian filtering and smoothing in continuous-discrete state-space models, where the dynamic model is formulated as an Itô stochastic differential equation (SDE), and the measurements are obtained at discrete time instants.
no code implementations • 28 Nov 2018 • Jakub Prüher, Toni Karvonen, Chris J. Oates, Ondřej Straka, Simo Särkkä
The sigma-point filters, such as the UKF, which exploit numerical quadrature to obtain an additional order of accuracy in the moment transformation step, are popular alternatives to the ubiquitous EKF.
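The moment transformation at the heart of such filters can be sketched with a basic symmetric sigma-point rule, as used in the unscented transform (the parameter `kappa` and the test function are illustrative):

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate a Gaussian N(mean, cov) through a non-linear map f using
    the 2n + 1 symmetric sigma points of the unscented transform."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)  # columns are the sigma-point offsets
    sigma = np.vstack([mean, mean + L.T, mean - L.T])
    w = np.full(2 * n + 1, 0.5 / (n + kappa))
    w[0] = kappa / (n + kappa)
    fy = np.array([np.atleast_1d(f(s)) for s in sigma])
    mean_y = w @ fy
    diff = fy - mean_y
    cov_y = (w[:, None] * diff).T @ diff
    return mean_y, cov_y

# Example: a non-linear measurement function of a 2D Gaussian state.
mean = np.array([1.0, 2.0])
cov = np.array([[2.0, 0.3], [0.3, 1.0]])
m, P = unscented_transform(mean, cov, lambda s: np.array([np.sin(s[0]) * s[1]]))
print(m, P)
```

Unlike the EKF's first-order linearisation, this quadrature rule reproduces the transformed mean and covariance exactly for linear maps and captures some curvature of non-linear ones.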
1 code implementation • 26 Sep 2018 • Toni Karvonen, Simo Särkkä, Chris J. Oates
Bayesian cubature provides a flexible framework for numerical integration, in which a priori knowledge on the integrand can be encoded and exploited.
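For a Gaussian kernel and a standard Gaussian integration measure the kernel mean embedding is available in closed form, so Bayesian cubature can be sketched in a few lines (the lengthscale, jitter, and node placement are illustrative, not the paper's method):

```python
import numpy as np

def bayesian_cubature(x, y, ell=0.5):
    """Bayesian cubature estimate of the integral of f against N(0, 1),
    under a zero-mean GP prior with a Gaussian kernel of lengthscale ell."""
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / ell) ** 2) + 1e-10 * np.eye(len(x))
    # Kernel mean embedding of N(0, 1); closed form for the Gaussian kernel.
    z = ell / np.sqrt(ell**2 + 1.0) * np.exp(-0.5 * x**2 / (ell**2 + 1.0))
    w = np.linalg.solve(K, z)                        # cubature weights
    estimate = w @ y                                 # posterior mean of the integral
    variance = ell / np.sqrt(ell**2 + 2.0) - z @ w   # posterior variance
    return estimate, variance

# Integrating x^2 against N(0, 1); the exact value is 1.
nodes = np.linspace(-3.0, 3.0, 15)
est, var = bayesian_cubature(nodes, nodes**2)
print(est, var)
```

The posterior variance supplies a model-based quantification of the integration error, which is what distinguishes Bayesian cubature from classical quadrature rules.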
Methodology • Numerical Analysis • Computation
no code implementations • 15 Mar 2017 • Jakub Prüher, Filip Tronarp, Toni Karvonen, Simo Särkkä, Ondřej Straka
An advantage of the Student-t process quadrature over the traditional Gaussian process quadrature is that the integral variance also depends on the function values, allowing for more robust modelling of the integration error.
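This data dependence can be seen in the standard Student-t process regression formulas, where the posterior covariance is the Gaussian process posterior covariance multiplied by a factor involving $\beta = y^\top K^{-1} y$. A sketch of that factor (the kernel, lengthscale, and degrees of freedom are illustrative):

```python
import numpy as np

def tp_variance_scaling(y, K, nu=5.0):
    """Factor by which a Student-t process with nu degrees of freedom
    rescales the corresponding GP posterior covariance; through
    beta = y^T K^{-1} y it depends on the observed function values,
    unlike the purely kernel-determined GP variance."""
    beta = float(y @ np.linalg.solve(K, y))
    return (nu + beta - 2.0) / (nu + len(y) - 2.0)

x = np.linspace(0.0, 1.0, 10)
d = x[:, None] - x[None, :]
K = np.exp(-0.5 * (d / 0.3) ** 2) + 1e-6 * np.eye(len(x))
y = np.sin(2.0 * np.pi * x)
print(tp_variance_scaling(y, K), tp_variance_scaling(2.0 * y, K))
```

Doubling the amplitude of the observations quadruples $\beta$ and hence inflates the scaling factor, so surprising function values widen the reported integration error, which is the robustness property the abstract refers to.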