Search Results for author: Toni Karvonen

Found 13 papers, 5 papers with code

Orthonormal Expansions for Translation-Invariant Kernels

no code implementations • 17 Jun 2022 • Filip Tronarp, Toni Karvonen

We present a general Fourier analytic technique for constructing orthonormal basis expansions of translation-invariant kernels from orthonormal bases of $\mathscr{L}_2(\mathbb{R})$.

Translation
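
The expansions in question take the generic form sketched below; this is only a schematic of the target object, the paper's contribution being the Fourier analytic construction of suitable functions $e_n$ from a given orthonormal basis of $\mathscr{L}_2(\mathbb{R})$:

    $k(x, y) = \sum_{n=0}^{\infty} e_n(x)\, e_n(y)$

Under suitable conditions, such a family $\{e_n\}$ is an orthonormal basis of the reproducing kernel Hilbert space of $k$.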

Maximum Likelihood Estimation in Gaussian Process Regression is Ill-Posed

no code implementations • 17 Mar 2022 • Toni Karvonen, Chris J. Oates

Gaussian process regression underpins countless academic and industrial applications of machine learning and statistics, with maximum likelihood estimation routinely used to select appropriate parameters for the covariance kernel.

regression
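
For context, the estimation problem in question is minimisation of the standard GP negative log marginal likelihood over kernel hyperparameters. A minimal self-contained sketch with an illustrative Gaussian (RBF) kernel and synthetic data, not code from the paper:

    import numpy as np
    from scipy.optimize import minimize

    def neg_log_marginal_likelihood(log_params, x, y):
        # log_params = (log amplitude, log lengthscale, log noise std); kernel is RBF.
        amp, ell, noise = np.exp(log_params)
        K = amp**2 * np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)
        K += noise**2 * np.eye(len(x))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(x) * np.log(2 * np.pi)

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 20)
    y = np.sin(4.0 * x) + 0.1 * rng.standard_normal(x.size)

    opt = minimize(neg_log_marginal_likelihood, x0=np.zeros(3), args=(x, y))
    print("MLE of (amplitude, lengthscale, noise std):", np.exp(opt.x))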

Asymptotic Bounds for Smoothness Parameter Estimates in Gaussian Process Interpolation

no code implementations • 10 Mar 2022 • Toni Karvonen

We prove that the maximum likelihood estimate of the smoothness parameter cannot asymptotically undersmooth the truth when the data are obtained on a fixed bounded subset of $\mathbb{R}^d$.

Black Box Probabilistic Numerics

1 code implementation • NeurIPS 2021 • Onur Teymur, Christopher N. Foley, Philip G. Breen, Toni Karvonen, Chris J. Oates

One approach is to model the unknown quantity of interest as a random variable, and to constrain this variable using data generated during the course of a traditional numerical method.
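
A toy illustration of that general idea, deliberately not the algorithm of the paper: run a classical quadrature rule at several step sizes $h$, place a GP on the rule's output as a function of $h$, and read off a mean and standard deviation for the limit $h \to 0$. Integrand, kernel, and step sizes are arbitrary choices for the sketch.

    import numpy as np

    def trapezoid_rule(f, a, b, n):
        # Composite trapezoidal rule with n subintervals.
        x = np.linspace(a, b, n + 1)
        fx = f(x)
        h = (b - a) / n
        return h * (0.5 * fx[0] + fx[1:-1].sum() + 0.5 * fx[-1])

    f = lambda x: np.exp(-x**2)                      # illustrative integrand on [0, 1]
    hs = np.array([0.2, 0.1, 0.05, 0.025])           # step sizes of the traditional method
    vals = np.array([trapezoid_rule(f, 0.0, 1.0, int(round(1.0 / h))) for h in hs])

    # GP over the method's output as a function of h, conditioned on the computed
    # values; the quantity of interest sits at h = 0.
    ell, amp = 0.2, 1.0
    k = lambda a, b: amp * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(hs, hs) + 1e-12 * np.eye(hs.size)
    ks = k(np.array([0.0]), hs)
    m0 = vals.mean()                                  # constant prior mean
    mean = m0 + (ks @ np.linalg.solve(K, vals - m0))[0]
    var = (amp - ks @ np.linalg.solve(K, ks.T))[0, 0]
    print(f"integral at h=0: mean {mean:.6f}, sd {max(var, 0.0)**0.5:.1e}")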

Small Sample Spaces for Gaussian Processes

no code implementations • 4 Mar 2021 • Toni Karvonen

It is known that the membership in a given reproducing kernel Hilbert space (RKHS) of the samples of a Gaussian process $X$ is controlled by a certain nuclear dominance condition.

Gaussian Processes

A probabilistic Taylor expansion with Gaussian processes

no code implementations • 1 Feb 2021 • Toni Karvonen, Jon Cockayne, Filip Tronarp, Simo Särkkä

We study a class of Gaussian processes for which the posterior mean, for a particular choice of data, replicates a truncated Taylor expansion of any order.

Gaussian Processes • regression
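
For reference, the object being replicated is the order-$k$ truncated Taylor expansion around an expansion point $a$ (standard definition, restated for convenience):

    $T_k f(x) = \sum_{n=0}^{k} \frac{f^{(n)}(a)}{n!} (x - a)^n$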

Semi-Exact Control Functionals From Sard's Method

1 code implementation • 31 Jan 2020 • Leah F. South, Toni Karvonen, Chris Nemeth, Mark Girolami, Chris J. Oates

The numerical approximation of posterior expected quantities of interest is considered.

Computation • Methodology
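
A minimal sketch of the underlying control-variate idea that control functionals refine; this is the generic variance-reduction device only, not the paper's semi-exact construction, which combines Stein-based control functionals with a polynomially exact component in the spirit of Sard:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.standard_normal(10_000)        # samples from the target (here simply N(0, 1))
    f = x**2 + np.sin(x)                   # integrand whose expectation is sought
    g = x**2                               # control variate with known mean E[g] = 1

    beta = np.cov(f, g)[0, 1] / g.var()    # near-optimal scalar coefficient
    cv_estimate = (f - beta * g).mean() + beta * 1.0

    print("plain Monte Carlo :", f.mean())
    print("with control      :", cv_estimate)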

Maximum likelihood estimation and uncertainty quantification for Gaussian process approximation of deterministic functions

no code implementations • 29 Jan 2020 • Toni Karvonen, George Wynne, Filip Tronarp, Chris J. Oates, Simo Särkkä

We show that the maximum likelihood estimation of the scale parameter alone provides significant adaptation against misspecification of the Gaussian process model in the sense that the model can become "slowly" overconfident at worst, regardless of the difference between the smoothness of the data-generating function and that expected by the model.

regression • Uncertainty Quantification
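
The scale parameter referred to is the $\sigma^2$ in a covariance kernel of the form $\sigma^2 R_\theta$; in the noiseless interpolation setting its maximum likelihood estimate is available in closed form, which is what the sketch below computes (the Matérn-3/2 correlation kernel and test function are illustrative):

    import numpy as np

    def scale_mle(R, y):
        # Closed-form MLE of sigma^2 when the kernel is sigma^2 * R and data are noiseless.
        return float(y @ np.linalg.solve(R, y)) / y.size

    x = np.linspace(0.0, 1.0, 15)
    d = np.abs(x[:, None] - x[None, :])
    R = (1.0 + np.sqrt(3.0) * d) * np.exp(-np.sqrt(3.0) * d)   # Matern-3/2, unit lengthscale
    y = np.sin(5.0 * x)

    print("sigma^2 MLE:", scale_mle(R, y))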

Taylor Moment Expansion for Continuous-Discrete Gaussian Filtering and Smoothing

2 code implementations • 8 Jan 2020 • Zheng Zhao, Toni Karvonen, Roland Hostettler, Simo Särkkä

The paper is concerned with non-linear Gaussian filtering and smoothing in continuous-discrete state-space models, where the dynamic model is formulated as an Itô stochastic differential equation (SDE), and the measurements are obtained at discrete time instants.
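
Written out, the continuous-discrete model class in question is (generic notation):

    $\mathrm{d}x(t) = f(x(t), t)\,\mathrm{d}t + L(x(t), t)\,\mathrm{d}W(t), \qquad y_k = h(x(t_k)) + r_k, \quad r_k \sim \mathcal{N}(0, R_k),$

where $W(t)$ is a Brownian motion, $f$ and $L$ are the drift and dispersion of the Itô SDE, and the measurements $y_k$ arrive only at the discrete times $t_k$.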

Improved Calibration of Numerical Integration Error in Sigma-Point Filters

no code implementations • 28 Nov 2018 • Jakub Prüher, Toni Karvonen, Chris J. Oates, Ondřej Straka, Simo Särkkä

The sigma-point filters, such as the unscented Kalman filter (UKF), which exploit numerical quadrature to obtain an additional order of accuracy in the moment transformation step, are popular alternatives to the ubiquitous extended Kalman filter (EKF).

Numerical Integration • Uncertainty Quantification
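
A minimal sketch of the moment transformation step being referred to, in its textbook unscented-transform form (the baseline construction, not the paper's recalibration; the parameter values are common illustrative choices):

    import numpy as np

    def unscented_transform(g, m, P, alpha=1.0, beta=2.0, kappa=1.0):
        # Propagate a Gaussian N(m, P) through a nonlinearity g via 2n + 1 sigma points.
        n = m.size
        lam = alpha**2 * (n + kappa) - n
        S = np.linalg.cholesky((n + lam) * P)        # columns are the sigma-point offsets
        sigmas = np.vstack([m, m + S.T, m - S.T])
        wm = np.full(2 * n + 1, 0.5 / (n + lam))
        wc = wm.copy()
        wm[0] = lam / (n + lam)
        wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
        Y = np.array([g(s) for s in sigmas])
        mean = wm @ Y
        diff = Y - mean
        cov = (wc[:, None] * diff).T @ diff
        return mean, cov

    # Illustrative nonlinearity: polar-to-Cartesian conversion of a range/bearing state.
    g = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
    m, P = np.array([1.0, 0.5]), np.diag([0.01, 0.04])
    mean, cov = unscented_transform(g, m, P)
    print(mean)
    print(cov)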

Symmetry Exploits for Bayesian Cubature Methods

1 code implementation • 26 Sep 2018 • Toni Karvonen, Simo Särkkä, Chris J. Oates

Bayesian cubature provides a flexible framework for numerical integration, in which a priori knowledge on the integrand can be encoded and exploited.

Methodology • Numerical Analysis • Computation
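
For orientation, a minimal sketch of plain Bayesian cubature, without the symmetry exploits developed in the paper: a Gaussian-kernel GP prior on the integrand, integrated against the uniform measure on $[0, 1]$, for which the kernel embeddings needed for the posterior mean and variance have closed forms. Nodes, lengthscale, and integrand below are illustrative.

    import numpy as np
    from scipy.special import erf

    ell = 0.3                                  # kernel lengthscale (illustrative)
    x = np.linspace(0.05, 0.95, 8)             # cubature nodes (illustrative)
    f = np.exp(-x)                             # integrand values; true integral is 1 - e^{-1}

    # Gaussian kernel k(a, b) = exp(-(a - b)^2 / (2 ell^2)) and its embeddings
    # against the uniform measure on [0, 1].
    K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2) + 1e-10 * np.eye(x.size)
    z = ell * np.sqrt(np.pi / 2) * (erf((1 - x) / (np.sqrt(2) * ell)) + erf(x / (np.sqrt(2) * ell)))
    zz = np.sqrt(2 * np.pi) * ell * erf(1 / (np.sqrt(2) * ell)) + 2 * ell**2 * (np.exp(-1 / (2 * ell**2)) - 1)

    w = np.linalg.solve(K, z)                  # Bayesian cubature weights
    mean = w @ f                               # posterior mean of the integral
    var = zz - z @ w                           # posterior variance of the integral
    print(f"estimate {mean:.6f} +/- {np.sqrt(max(var, 0.0)):.1e}   (truth {1 - np.exp(-1):.6f})")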

Student-t Process Quadratures for Filtering of Non-Linear Systems with Heavy-Tailed Noise

no code implementations • 15 Mar 2017 • Jakub Prüher, Filip Tronarp, Toni Karvonen, Simo Särkkä, Ondřej Straka

An advantage of the Student-t process quadrature over the traditional Gaussian process quadrature is that the integral variance also depends on the function values, allowing for more robust modelling of the integration error.
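
The source of that dependence is visible in the Student-t process predictive covariance, which rescales the corresponding Gaussian process posterior covariance by a data-dependent factor (stated here for a zero-mean process with $\nu > 2$ degrees of freedom conditioned on $N$ observations $y$ with kernel matrix $K$; a standard Student-t process identity, not a formula specific to the paper):

    $\mathrm{cov}_{\mathrm{TP}} = \dfrac{\nu + y^\top K^{-1} y - 2}{\nu + N - 2}\, \mathrm{cov}_{\mathrm{GP}}$

Integrating the posterior then makes the integral variance depend on the observed function values through $y^\top K^{-1} y$.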
