Search Results for author: ST John

Found 16 papers, 5 papers with code

Nonparametric modeling of the composite effect of multiple nutrients on blood glucose dynamics

1 code implementation • 6 Nov 2023 Arina Odnoblyudova, Çağlar Hızlı, ST John, Andrea Cognolato, Anne Juuti, Simo Särkkä, Kirsi Pietiläinen, Pekka Marttinen

By differentiating treatment components, incorporating their dosages, and sharing statistical information across patients via a hierarchical multi-output Gaussian process, our method improves prediction accuracy over existing approaches, and allows us to interpret the different effects of carbohydrates and fat on the overall glucose response.
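The hierarchical multi-output GP mentioned above shares statistical strength across patients by modelling their responses as correlated outputs of one joint GP. A minimal numpy sketch of one standard construction for this, the intrinsic coregionalization model (chosen here purely for illustration; it is not taken from the paper's code):

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    """Squared-exponential kernel between 1-D input arrays x1, x2."""
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Intrinsic coregionalization: K = B kron k(X, X), where B is a
# positive-definite matrix coupling the P outputs (e.g. patients)
# and k models correlation over the shared input (e.g. time).
rng = np.random.default_rng(0)
X = np.linspace(0, 5, 20)
P = 3
W = rng.normal(size=(P, 2))
B = W @ W.T + 0.1 * np.eye(P)          # output covariance
K = np.kron(B, rbf(X, X))              # joint covariance over all outputs

# Sample correlated functions for all outputs at once.
L = np.linalg.cholesky(K + 1e-6 * np.eye(K.shape[0]))
f = (L @ rng.normal(size=K.shape[0])).reshape(P, -1)
print(f.shape)  # (3, 20)
```

Because the joint covariance factorizes over outputs and inputs, observations from one output (patient) inform predictions for the others.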

Beyond Intuition, a Framework for Applying GPs to Real-World Data

1 code implementation • 6 Jul 2023 Kenza Tazi, Jihao Andreas Lin, Ross Viljoen, Alex Gardner, ST John, Hong Ge, Richard E. Turner

Gaussian Processes (GPs) offer an attractive method for regression over small, structured and correlated datasets.

Gaussian Processes · Regression
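As a reference point for why GPs suit small, structured datasets, exact GP regression amounts to a few lines of linear algebra. A minimal numpy sketch (illustrative only, not from the paper):

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between 1-D input arrays."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(1)
X = np.linspace(0, 2 * np.pi, 15)
y = np.sin(X) + 0.1 * rng.normal(size=X.size)
Xs = np.linspace(0, 2 * np.pi, 100)     # test inputs
noise = 0.1 ** 2

# GP posterior: mean = Ks^T (K + sigma^2 I)^{-1} y
#               cov  = Kss - Ks^T (K + sigma^2 I)^{-1} Ks
K = rbf(X, X) + noise * np.eye(X.size)
Ks = rbf(X, Xs)
mean = Ks.T @ np.linalg.solve(K, y)
var = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)

print(np.max(np.abs(mean - np.sin(Xs))))  # small on this smooth target
```

The posterior variance (diagonal of `var`) provides calibrated uncertainty at each test input, which is the main attraction for small-data regression.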

Improving Hyperparameter Learning under Approximate Inference in Gaussian Process Models

1 code implementation • 7 Jun 2023 Rui Li, ST John, Arno Solin

Approximate inference in Gaussian process (GP) models with non-conjugate likelihoods gets entangled with the learning of the model hyperparameters.

Hyperparameter Optimization · Variational Inference

Towards Improved Learning in Gaussian Processes: The Best of Two Worlds

no code implementations • 11 Nov 2022 Rui Li, ST John, Arno Solin

Gaussian process training decomposes into inference of the (approximate) posterior and learning of the hyperparameters.

Binary Classification · Gaussian Processes +3

Fantasizing with Dual GPs in Bayesian Optimization and Active Learning

no code implementations • 2 Nov 2022 Paul E. Chang, Prakhar Verma, ST John, Victor Picheny, Henry Moss, Arno Solin

Gaussian processes (GPs) are the main surrogate functions used for sequential modelling such as Bayesian Optimization and Active Learning.

Active Learning · Bayesian Optimization +1

Causal Modeling of Policy Interventions From Sequences of Treatments and Outcomes

no code implementations • 9 Sep 2022 Çağlar Hızlı, ST John, Anne Juuti, Tuure Saarinen, Kirsi Pietiläinen, Pekka Marttinen

Our model enables the estimation of a treatment policy from observational sequences of treatments and outcomes, and it can predict the interventional and counterfactual progression of the outcome after an intervention on the treatment policy (in contrast with the causal effect of a single treatment).

Counterfactual · Decision Making +3

Non-separable Spatio-temporal Graph Kernels via SPDEs

no code implementations • 16 Nov 2021 Alexander Nikitin, ST John, Arno Solin, Samuel Kaski

Gaussian processes (GPs) provide a principled and direct approach for inference and learning on graphs.

Gaussian Processes

A Tutorial on Sparse Gaussian Processes and Variational Inference

no code implementations • 27 Dec 2020 Felix Leibfried, Vincent Dutordoir, ST John, Nicolas Durrande

In this context, a convenient choice for approximate inference is variational inference (VI), where the problem of Bayesian inference is cast as an optimization problem -- namely, to maximize a lower bound of the log marginal likelihood.

Bayesian Inference · Gaussian Processes +2
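The lower bound in question is the evidence lower bound (ELBO). For a conjugate Gaussian model both the ELBO and the exact log marginal likelihood are available in closed form, so the bound can be checked numerically. A toy sketch of this identity (not the tutorial's code; model and values are illustrative):

```python
import numpy as np

# Toy conjugate model: w ~ N(0, 1), y | w ~ N(w, s2), one observation y.
y, s2 = 1.3, 0.5

def log_marginal():
    # Exact evidence: log N(y; 0, 1 + s2), closed form for this model.
    return -0.5 * np.log(2 * np.pi * (1 + s2)) - y**2 / (2 * (1 + s2))

def elbo(m, v):
    # ELBO = E_q[log p(y|w)] + E_q[log p(w)] + H[q], with q = N(m, v).
    ell = -0.5 * np.log(2 * np.pi * s2) - ((y - m) ** 2 + v) / (2 * s2)
    cross = -0.5 * np.log(2 * np.pi) - (m**2 + v) / 2
    entropy = 0.5 * np.log(2 * np.pi * np.e * v)
    return ell + cross + entropy

# Any q gives a lower bound; the exact posterior makes it tight.
m_post, v_post = y / (1 + s2), s2 / (1 + s2)
print(elbo(0.0, 1.0) <= log_marginal())                  # True
print(np.isclose(elbo(m_post, v_post), log_marginal()))  # True
```

The gap between the two quantities is KL(q || p(w|y)), which is why maximizing the ELBO over q both tightens the bound and improves the posterior approximation.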

Amortized variance reduction for doubly stochastic objectives

no code implementations • 9 Mar 2020 Ayman Boustati, Sattar Vakili, James Hensman, ST John

Approximate inference in complex probabilistic models such as deep Gaussian processes requires the optimisation of doubly stochastic objective functions.

Gaussian Processes
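"Doubly stochastic" here means the objective is estimated by Monte Carlo over both data minibatches and samples from the variational distribution, so the gradients are noisy. One classical remedy is a control variate with known mean, sketched below on a toy expectation (illustrative only; the paper's contribution is an amortized scheme for choosing the control-variate coefficient, which this sketch does not implement):

```python
import numpy as np

rng = np.random.default_rng(2)

# Estimate E_{e ~ N(0,1)}[f(e)] for f(e) = e^2 + 0.5 e^3 (true value: 1).
f = lambda e: e**2 + 0.5 * e**3

def estimate(n, use_cv):
    e = rng.normal(size=n)
    samples = f(e)
    if use_cv:
        # Control variate g(e) = e^3 has known mean 0 under N(0, 1);
        # subtracting it keeps the estimator unbiased but less noisy.
        samples = samples - 0.5 * e**3
    return samples.mean()

plain = np.array([estimate(100, False) for _ in range(500)])
cv = np.array([estimate(100, True) for _ in range(500)])
print(plain.var() > cv.var())  # control variate reduces variance
```

Both estimators target the same expectation; only the spread of the estimates changes, which is exactly what faster, more stable optimisation of a doubly stochastic objective requires.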

A Framework for Interdomain and Multioutput Gaussian Processes

1 code implementation • 2 Mar 2020 Mark van der Wilk, Vincent Dutordoir, ST John, Artem Artemev, Vincent Adam, James Hensman

One obstacle to the use of Gaussian processes (GPs) in large-scale problems, and as a component in deep learning systems, is the need for bespoke derivations and implementations for small variations in the model or inference.

Gaussian Processes

Variational Gaussian Process Models without Matrix Inverses

no code implementations • AABI (Advances in Approximate Bayesian Inference) Symposium 2019 Mark van der Wilk, ST John, Artem Artemev, James Hensman

We present a variational approximation for a wide range of GP models that does not require a matrix inverse to be performed at each optimisation step.

Gaussian Process Modulated Cox Processes under Linear Inequality Constraints

no code implementations • 28 Feb 2019 Andrés F. López-Lopera, ST John, Nicolas Durrande

We introduce a novel finite approximation of GP-modulated Cox processes where positiveness conditions can be imposed directly on the GP, with no restrictions on the covariance function.

Point Processes

Scalable GAM using sparse variational Gaussian processes

no code implementations • 28 Dec 2018 Vincent Adam, Nicolas Durrande, ST John

Generalized additive models (GAMs) are a widely used class of models of interest to statisticians as they provide a flexible way to design interpretable models of data beyond linear models.

Additive Models · Gaussian Processes +1
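A GAM models the response as a sum of univariate functions, y ≈ f1(x1) + f2(x2) + ..., which is what makes each component individually plottable and interpretable. The classical fitting procedure is backfitting: cycle through the components, fitting each one to the partial residuals left by the others. A minimal sketch with cheap polynomial smoothers (unrelated to the paper's sparse variational GP implementation):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
X = rng.uniform(-1, 1, size=(n, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=n)

def smooth(x, r, degree=5):
    """Cheap univariate smoother: least-squares polynomial fit of r on x."""
    coeffs = np.polyfit(x, r, degree)
    return np.polyval(coeffs, x)

# Backfitting: fit each component to the residual left by the others,
# repeating until the additive fit stabilizes.
f = np.zeros((2, n))
for _ in range(20):
    for j in range(2):
        resid = y - f.sum(axis=0) + f[j]
        f[j] = smooth(X[:, j], resid)
        f[j] -= f[j].mean()            # identifiability: center each component

fit = f.sum(axis=0) + y.mean()
print(np.mean((fit - y) ** 2))         # close to the noise floor
```

Replacing `smooth` with a sparse variational GP per component is, roughly, the design space the paper explores, trading the polynomial's rigidity for a nonparametric smoother that still scales.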

Learning Invariances using the Marginal Likelihood

no code implementations • NeurIPS 2018 Mark van der Wilk, Matthias Bauer, ST John, James Hensman

Generalising well in supervised learning tasks relies on correctly extrapolating the training data to a large region of the input space.

Data Augmentation · Gaussian Processes +2
