no code implementations • 18 Apr 2023 • George Wynne
This manuscript studies the application of Bayes Hilbert spaces to the posterior approximation problem.
1 code implementation • 9 Jun 2022 • George Wynne, Mikołaj Kasprzak, Andrew B. Duncan
Kernel Stein discrepancy (KSD) is a widely used kernel-based measure of discrepancy between probability measures.
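As a minimal illustration of the quantity involved (not the paper's estimator or setting), a V-statistic estimate of squared KSD can be sketched for a one-dimensional standard normal target with an RBF base kernel; the target, bandwidth, and sample sizes below are illustrative choices:

```python
import numpy as np

def ksd_squared(x, h=1.0):
    # V-statistic estimate of squared KSD for a standard normal target
    # (score s(x) = -x), 1-D samples, RBF base kernel of bandwidth h.
    d = x[:, None] - x[None, :]            # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))         # base kernel k(x_i, x_j)
    dkx = -d / h**2 * k                    # d k / d x_i
    dky = d / h**2 * k                     # d k / d x_j
    dkxy = (1 / h**2 - d**2 / h**4) * k    # d^2 k / d x_i d x_j
    s = -x                                 # score of N(0, 1)
    kp = dkxy + s[:, None] * dky + s[None, :] * dkx + np.outer(s, s) * k
    return kp.mean()

rng = np.random.default_rng(0)
good = ksd_squared(rng.standard_normal(500))        # samples match the target
bad = ksd_squared(rng.standard_normal(500) + 2.0)   # shifted samples
```

The discrepancy only requires the target's score function, not its normalising constant, which is what makes KSD attractive for goodness-of-fit testing against unnormalised models.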
1 code implementation • 7 Feb 2022 • Xing Liu, Harrison Zhu, Jean-François Ton, George Wynne, Andrew Duncan
Stein variational gradient descent (SVGD) is a deterministic particle inference algorithm that provides an efficient alternative to Markov chain Monte Carlo.
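A bare-bones sketch of the SVGD update, assuming a one-dimensional standard normal target, an RBF kernel, and a fixed step size (all illustrative simplifications, not the paper's configuration):

```python
import numpy as np

def svgd_step(x, step=0.1, h=1.0):
    # One SVGD update for a 1-D standard normal target (score -x),
    # RBF kernel of bandwidth h, fixed step size for simplicity.
    d = x[:, None] - x[None, :]          # d[j, i] = x_j - x_i
    k = np.exp(-d**2 / (2 * h**2))       # k(x_j, x_i)
    grad_k = -d / h**2 * k               # gradient of k in its first argument
    # driving term pulls particles toward high density, kernel-gradient
    # term pushes them apart so they do not collapse onto the mode
    phi = (k * (-x)[:, None] + grad_k).mean(axis=0)
    return x + step * phi

rng = np.random.default_rng(1)
x = rng.uniform(3.0, 5.0, size=200)      # particles initialised off-target
for _ in range(500):
    x = svgd_step(x)                     # particles drift toward N(0, 1)
```

The update is deterministic given the initial particles, which is the sense in which SVGD is a deterministic alternative to MCMC.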
no code implementations • 25 Oct 2021 • Veit Wild, George Wynne
Variational Gaussian process (GP) approximations have become a standard tool in fast GP inference.
no code implementations • 26 May 2021 • George Wynne, Stanislav Nagy
Statistical depth gauges how representative a point is relative to a reference probability measure.

1 code implementation • 25 Aug 2020 • George Wynne, Andrew B. Duncan
We propose a nonparametric two-sample test procedure based on Maximum Mean Discrepancy (MMD) for testing the hypothesis that two samples of functions have the same underlying distribution, using kernels defined on function spaces.
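The paper's kernels act on function spaces; as a finite-dimensional sketch of the underlying idea only, the standard unbiased MMD estimator with an RBF kernel on scalar samples looks like this (bandwidth and sample sizes are illustrative):

```python
import numpy as np

def mmd2_unbiased(x, y, h=1.0):
    # Unbiased estimate of squared MMD between 1-D samples x and y
    # with an RBF kernel of bandwidth h.
    def gram(a, b):
        return np.exp(-(a[:, None] - b[None, :])**2 / (2 * h**2))
    kxx, kyy, kxy = gram(x, x), gram(y, y), gram(x, y)
    n, m = len(x), len(y)
    # exclude the diagonals so the within-sample terms are unbiased
    return (kxx.sum() - np.trace(kxx)) / (n * (n - 1)) \
         + (kyy.sum() - np.trace(kyy)) / (m * (m - 1)) \
         - 2.0 * kxy.mean()

rng = np.random.default_rng(2)
same = mmd2_unbiased(rng.standard_normal(400), rng.standard_normal(400))
diff = mmd2_unbiased(rng.standard_normal(400), rng.standard_normal(400) + 1.0)
```

In a two-sample test the estimate is compared against a null distribution, typically obtained by permuting the pooled samples.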
no code implementations • 29 Jan 2020 • Toni Karvonen, George Wynne, Filip Tronarp, Chris J. Oates, Simo Särkkä
We show that maximum likelihood estimation of the scale parameter alone provides significant adaptation against misspecification of the Gaussian process model, in the sense that the model can at worst become "slowly" overconfident, regardless of the difference between the smoothness of the data-generating function and that assumed by the model.
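The scale parameter admits a closed-form maximum likelihood estimate, a standard Gaussian process identity; the following sketch (with an illustrative RBF kernel and synthetic data, not the paper's experiments) shows the estimator the analysis concerns:

```python
import numpy as np

def scale_mle(y, K):
    # Closed-form MLE of sigma^2 in the zero-mean GP model y ~ N(0, sigma^2 K):
    # sigma_hat^2 = y^T K^{-1} y / n.
    return y @ np.linalg.solve(K, y) / len(y)

# synthetic data drawn from the model itself, so the estimate
# should recover the true scale sigma^2 = 4
t = np.linspace(0.0, 10.0, 400)
K = np.exp(-(t[:, None] - t[None, :])**2 / 2) + 1e-6 * np.eye(400)  # RBF + jitter
rng = np.random.default_rng(3)
y = 2.0 * np.linalg.cholesky(K) @ rng.standard_normal(400)          # sigma = 2
est = scale_mle(y, K)
```

When the data are not drawn from the model, this estimate rescales the posterior credible intervals, which is the adaptation mechanism studied in the paper.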
no code implementations • 29 Jan 2020 • George Wynne, François-Xavier Briol, Mark Girolami
In this setting, an important theoretical question of practical relevance is how accurate the Gaussian process approximations will be, given the difficulty of the problem, our model, and the extent of the misspecification.