Search Results for author: Edwin V. Bonilla

Found 25 papers, 7 papers with code

Addressing Over-Smoothing in Graph Neural Networks via Deep Supervision

no code implementations • 25 Feb 2022 • Pantelis Elinas, Edwin V. Bonilla

Learning useful node and graph representations with graph neural networks (GNNs) is a challenging task.

Graph Property Prediction
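The title points at the core recipe: attach an auxiliary supervised loss to every intermediate GNN layer so that deep stacks keep producing discriminative node representations instead of over-smoothing. A minimal sketch of that pattern on a random toy graph (layer sizes, graph, and training setup are illustrative assumptions, not the authors' implementation):

```python
import torch
import torch.nn.functional as F

# Toy setup: 10 nodes, 8 features, 3 classes, random symmetric adjacency.
n, d, c = 10, 8, 3
X = torch.randn(n, d)
A = (torch.rand(n, n) < 0.3).float()
A = ((A + A.T + torch.eye(n)) > 0).float()        # symmetrize, add self-loops
Dinv = torch.diag(A.sum(1).rsqrt())
A_hat = Dinv @ A @ Dinv                           # symmetric normalization
y = torch.randint(0, c, (n,))

# A deep GCN-style stack with one auxiliary classification head per layer.
layers = torch.nn.ModuleList(
    [torch.nn.Linear(d if i == 0 else 16, 16) for i in range(4)])
heads = torch.nn.ModuleList([torch.nn.Linear(16, c) for _ in layers])
opt = torch.optim.Adam(
    list(layers.parameters()) + list(heads.parameters()), lr=1e-2)

for step in range(100):
    h, loss = X, 0.0
    for layer, head in zip(layers, heads):
        h = torch.relu(layer(A_hat @ h))           # one propagation step
        loss = loss + F.cross_entropy(head(h), y)  # supervision at every depth
    opt.zero_grad(); loss.backward(); opt.step()
```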

Optimizing Sequential Experimental Design with Deep Reinforcement Learning

no code implementations • 2 Feb 2022 • Tom Blau, Edwin V. Bonilla, Amir Dezfouli, Iadine Chadès

Bayesian approaches developed to solve the optimal design of sequential experiments are mathematically elegant but computationally challenging.

Experimental Design, reinforcement-learning

Learning ODEs via Diffeomorphisms for Fast and Robust Integration

no code implementations • 4 Jul 2021 • Weiming Zhi, Tin Lai, Lionel Ott, Edwin V. Bonilla, Fabio Ramos

Advances in differentiable numerical integrators have enabled the use of gradient descent techniques to learn ordinary differential equations (ODEs).
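To make the sentence above concrete: with a differentiable integrator, the ODE's solution becomes a differentiable function of its parameters, so they can be fitted by gradient descent. A self-contained toy sketch for dx/dt = a·x with an unrolled explicit Euler scheme and a hand-derived gradient (the paper itself goes further, learning a diffeomorphism that simplifies the dynamics):

```python
import numpy as np

# Fit the rate `a` of dx/dt = a*x so the Euler-integrated trajectory
# hits a target value at time T. The unrolled solution is
# x_T = x0 * (1 + h*a)**steps, which is differentiable in `a` in closed form.
x0, T, steps = 1.0, 1.0, 100
h = T / steps
target = np.exp(-0.5)        # generated by the true rate a* = -0.5

a = 0.0                      # initial guess
for it in range(500):
    xT = x0 * (1.0 + h * a) ** steps
    grad = 2 * (xT - target) * x0 * steps * h * (1.0 + h * a) ** (steps - 1)
    a -= 0.1 * grad          # gradient descent on the squared error
print(a)                     # -> close to -0.5
```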

Model Selection for Bayesian Autoencoders

no code implementations • NeurIPS 2021 • Ba-Hien Tran, Simone Rossi, Dimitrios Milios, Pietro Michiardi, Edwin V. Bonilla, Maurizio Filippone

We develop a novel method for carrying out model selection for Bayesian autoencoders (BAEs) by means of prior hyper-parameter optimization.

Model Selection, Representation Learning

SigGPDE: Scaling Sparse Gaussian Processes on Sequential Data

no code implementations • 10 May 2021 • Maud Lemercier, Cristopher Salvi, Thomas Cass, Edwin V. Bonilla, Theodoros Damoulas, Terry Lyons

Making predictions and quantifying their uncertainty when the input data is sequential is a fundamental learning challenge, recently attracting increasing attention.

Gaussian Processes, Time Series, +1

BORE: Bayesian Optimization by Density-Ratio Estimation

1 code implementation • 17 Feb 2021 • Louis C. Tiao, Aaron Klein, Matthias Seeger, Edwin V. Bonilla, Cédric Archambeau, Fabio Ramos

Bayesian optimization (BO) is among the most effective and widely-used blackbox optimization methods.

Density Ratio Estimation
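BORE's central observation, per the title, is that the expected-improvement acquisition can be recast as a density ratio, which in turn can be estimated by a probabilistic classifier trained to separate the best γ-fraction of evaluations from the rest. A minimal sketch of one such loop; the toy objective, candidate pool, and random-forest classifier are illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.1 * x ** 2          # hypothetical black-box objective

X = rng.uniform(-3, 3, (10, 1))                     # initial random design
y = f(X).ravel()

for it in range(30):
    tau = np.quantile(y, 0.25)                      # gamma = 0.25
    z = (y <= tau).astype(int)                      # 1 = among the best gamma-fraction
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, z)
    cand = rng.uniform(-3, 3, (500, 1))             # random candidate pool
    p = clf.predict_proba(cand)[:, 1]               # classifier output as acquisition
    x_next = cand[np.argmax(p)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next)[0])

print(X[np.argmin(y)], y.min())                     # best input found and its value
```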

Distribution Regression for Sequential Data

no code implementations • 10 Jun 2020 • Maud Lemercier, Cristopher Salvi, Theodoros Damoulas, Edwin V. Bonilla, Terry Lyons

In this paper, we develop a rigorous mathematical framework for distribution regression where inputs are complex data streams.

Time Series

Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations

no code implementations • 6 Mar 2020 • Simone Rossi, Markus Heinonen, Edwin V. Bonilla, Zheyang Shen, Maurizio Filippone

Variational inference techniques based on inducing variables provide an elegant framework for scalable posterior estimation in Gaussian process (GP) models.

Gaussian Processes, Variational Inference
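For context on the inducing-variable framework the abstract refers to: given M inducing inputs Z and a variational Gaussian posterior N(m, S) over the inducing values, the predictive equations take a standard closed form. A NumPy sketch with a made-up (m, S) standing in for one learned by optimizing the ELBO:

```python
import numpy as np

def rbf(A, B, ell=0.5):
    """Squared-exponential kernel matrix k(A, B) for 1-D inputs."""
    return np.exp(-0.5 * (A - B.T) ** 2 / ell ** 2)

# Z: inducing inputs; N(m, S): variational posterior over inducing values.
Z = np.linspace(-2, 2, 8)[:, None]
m = np.sin(Z).ravel()                    # stand-in for a learned variational mean
S = 0.01 * np.eye(len(Z))                # stand-in for a learned covariance

Xs = np.linspace(-2, 2, 100)[:, None]    # test inputs
Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))  # jitter for numerical stability
Ksz = rbf(Xs, Z)
A = np.linalg.solve(Kzz, Ksz.T).T        # K_{*Z} K_{ZZ}^{-1}

mu = A @ m                                                   # predictive mean
var = rbf(Xs, Xs).diagonal() - (A * Ksz).sum(1) + ((A @ S) * A).sum(1)
```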

Quantile Propagation for Wasserstein-Approximate Gaussian Processes

1 code implementation • NeurIPS 2020 • Rui Zhang, Christian J. Walder, Edwin V. Bonilla, Marian-Andrei Rizoiu, Lexing Xie

We show that QP matches quantile functions rather than moments as in EP and has the same mean update but a smaller variance update than EP, thereby alleviating EP's tendency to over-estimate posterior variances.

Bayesian Inference, Gaussian Processes
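To unpack the quantile-versus-moment distinction: in one dimension, the 2-Wasserstein distance between distributions is the L2 distance between their quantile functions, which for equal-sized samples reduces to comparing order statistics. A quick empirical illustration (not the paper's inference scheme):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 10000)          # samples from one distribution
y = rng.lognormal(0.0, 0.5, 10000)       # samples from another

# Moment matching (the EP view): compare means and variances.
print(x.mean() - y.mean(), x.var() - y.var())

# Quantile matching (the QP view): in 1-D, the squared 2-Wasserstein
# distance is the L2 distance between quantile functions, which for
# equal-sized samples is a comparison of sorted order statistics.
print(np.mean((np.sort(x) - np.sort(y)) ** 2))
```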

Structured Variational Inference in Continuous Cox Process Models

1 code implementation • NeurIPS 2019 • Virginia Aglietti, Edwin V. Bonilla, Theodoros Damoulas, Sally Cripps

We propose a scalable framework for inference in an inhomogeneous Poisson process modeled by a continuous sigmoidal Cox process that assumes the corresponding intensity function is given by a Gaussian process (GP) prior transformed with a scaled logistic sigmoid function.

Numerical Integration, Variational Inference
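The generative model stated in the abstract is straightforward to simulate: draw f from a GP prior, squash it with a scaled logistic sigmoid to get the intensity, and thin a homogeneous Poisson process. A minimal sketch (grid resolution, kernel, and the maximum intensity are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Grid over the domain and one RBF-kernel GP prior sample on it.
x = np.linspace(0, 10, 200)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
f = np.linalg.cholesky(K + 1e-6 * np.eye(len(x))) @ rng.standard_normal(len(x))

# Sigmoidal Cox process: intensity is a scaled logistic sigmoid of the GP.
lam_max = 5.0
lam = lam_max / (1.0 + np.exp(-f))

# Sample events by thinning a homogeneous Poisson process at rate lam_max.
n = rng.poisson(lam_max * 10.0)                    # 10.0 = domain length
t = rng.uniform(0, 10, n)
keep = rng.uniform(0, lam_max, n) < np.interp(t, x, lam)
events = t[keep]
```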

Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings

1 code implementation • NeurIPS 2020 • Pantelis Elinas, Edwin V. Bonilla, Louis Tiao

We propose a framework that lifts the capabilities of graph convolutional networks (GCNs) to scenarios where no input graph is given and increases their robustness to adversarial attacks.

Bayesian Inference, General Classification, +1

Scalable Grouped Gaussian Processes via Direct Cholesky Functional Representations

no code implementations • 10 Mar 2019 • Astrid Dahl, Edwin V. Bonilla

We consider multi-task regression models where observations are assumed to be a linear combination of several latent node and weight functions, all drawn from Gaussian process (GP) priors that allow nonzero covariance between grouped latent functions.

Gaussian Processes, Variational Inference
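A sketch of the generative model this abstract describes, with arbitrary kernels and sizes: each output task is a pointwise linear combination of latent GP node functions, weighted by latent GP weight functions:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 5, 100)

def gp_sample(ell):
    """One draw from a zero-mean GP with an RBF kernel on the grid x."""
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
    return np.linalg.cholesky(K + 1e-6 * np.eye(len(x))) @ rng.standard_normal(len(x))

Q, P = 2, 3                                       # latent node functions, output tasks
f = np.stack([gp_sample(0.5) for _ in range(Q)])                       # node functions
w = np.stack([[gp_sample(2.0) for _ in range(Q)] for _ in range(P)])   # weight functions

# Each task is a pointwise linear combination of the node functions,
# with GP-distributed (input-dependent) weights, plus observation noise.
y = np.einsum('pqn,qn->pn', w, f) + 0.05 * rng.standard_normal((P, len(x)))
```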

Grouped Gaussian Processes for Solar Power Prediction

no code implementations • 7 Jun 2018 • Astrid Dahl, Edwin V. Bonilla

We consider multi-task regression models where the observations are assumed to be a linear combination of several latent node functions and weight functions, which are both drawn from Gaussian process priors.

Gaussian Processes

Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference

no code implementations • 5 Jun 2018 • Louis C. Tiao, Edwin V. Bonilla, Fabio Ramos

We formalize the problem of learning interdomain correspondences in the absence of paired data as Bayesian inference in a latent variable model (LVM), where one seeks the underlying hidden representations of entities from one domain as entities from the other domain.

Bayesian Inference, Variational Inference

Calibrating Deep Convolutional Gaussian Processes

1 code implementation • 26 May 2018 • Gia-Lac Tran, Edwin V. Bonilla, John P. Cunningham, Pietro Michiardi, Maurizio Filippone

The wide adoption of Convolutional Neural Networks (CNNs) in applications where decision-making under uncertainty is fundamental has brought a great deal of attention to the ability of these models to accurately quantify the uncertainty in their predictions.

Decision Making, Decision Making Under Uncertainty, +3

Semi-parametric Network Structure Discovery Models

no code implementations • 27 Feb 2017 • Amir Dezfouli, Edwin V. Bonilla, Richard Nock

We propose a network structure discovery model for continuous observations that generalizes linear causal models by incorporating a Gaussian process (GP) prior on a network-independent component, and random sparsity and weight matrices as the network-dependent parameters.

Variational Inference

AutoGP: Exploring the Capabilities and Limitations of Gaussian Process Models

no code implementations • 18 Oct 2016 • Karl Krauth, Edwin V. Bonilla, Kurt Cutajar, Maurizio Filippone

We investigate the capabilities and limitations of Gaussian process models by jointly exploring three complementary directions: (i) scalable and statistically efficient inference; (ii) flexible kernels; and (iii) objective functions for hyperparameter learning alternative to the marginal likelihood.

General Classification

Random Feature Expansions for Deep Gaussian Processes

1 code implementation • ICML 2017 • Kurt Cutajar, Edwin V. Bonilla, Pietro Michiardi, Maurizio Filippone

The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty.

Gaussian Processes, Variational Inference
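The random feature expansions in question build on the standard random Fourier feature construction, where an RBF kernel is approximated by a finite trigonometric basis. A sketch of that building block (the feature count and lengthscale are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
ell, D = 1.0, 2000                        # lengthscale and number of random features

# Spectral samples for the RBF kernel: omega ~ N(0, 1/ell^2), phase b ~ U(0, 2*pi).
omega = rng.normal(0.0, 1.0 / ell, D)
b = rng.uniform(0.0, 2 * np.pi, D)

def phi(x):
    """Random Fourier feature map: phi(x) @ phi(x').T approximates k(x, x')."""
    return np.sqrt(2.0 / D) * np.cos(np.outer(x, omega) + b)

x = np.array([0.0, 0.5, 2.0])
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
K_approx = phi(x) @ phi(x).T
print(np.abs(K_exact - K_approx).max())   # small for large D
```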

Gray-box inference for structured Gaussian process models

no code implementations • 14 Sep 2016 • Pietro Galliani, Amir Dezfouli, Edwin V. Bonilla, Novi Quadrianto

We develop an automated variational inference method for Bayesian structured prediction problems with Gaussian process (GP) priors and linear-chain likelihoods.

Stochastic Optimization, Structured Prediction, +1

Generic Inference in Latent Gaussian Process Models

1 code implementation • 2 Sep 2016 • Edwin V. Bonilla, Karl Krauth, Amir Dezfouli

We evaluate our approach quantitatively and qualitatively with experiments on small datasets, medium-scale datasets and large datasets, showing its competitiveness under different likelihood models and sparsity levels.

General Classification, Stochastic Optimization

Scalable Inference for Gaussian Process Models with Black-Box Likelihoods

no code implementations • NeurIPS 2015 • Amir Dezfouli, Edwin V. Bonilla

We propose a sparse method for scalable automated variational inference (AVI) in a large class of models with Gaussian process (GP) priors, multiple latent functions, multiple outputs and non-linear likelihoods.

General Classification, Variational Inference

Extended and Unscented Gaussian Processes

no code implementations • NeurIPS 2014 • Daniel M. Steinberg, Edwin V. Bonilla

We present two new methods for inference in Gaussian process (GP) models with general nonlinear likelihoods.

Gaussian Processes, General Classification
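The "unscented" variant borrows the unscented transform from nonlinear filtering: instead of linearizing the likelihood's nonlinearity, it propagates a few deterministically chosen sigma points through it. A one-dimensional sketch of the transform itself (not the paper's full GP inference scheme):

```python
import numpy as np

def unscented_moments(g, m, s2, kappa=2.0):
    """Propagate N(m, s2) through a nonlinearity g via the 1-D unscented transform."""
    n = 1
    spread = np.sqrt((n + kappa) * s2)
    pts = np.array([m, m + spread, m - spread])   # sigma points
    w = np.array([kappa / (n + kappa),
                  0.5 / (n + kappa),
                  0.5 / (n + kappa)])             # their weights
    gy = g(pts)
    mean = w @ gy
    var = w @ (gy - mean) ** 2
    return mean, var

# Moments of sigmoid(f) for f ~ N(0, 1), versus a Monte Carlo estimate.
g = lambda f: 1.0 / (1.0 + np.exp(-f))
print(unscented_moments(g, 0.0, 1.0))
f = np.random.default_rng(4).normal(0, 1, 100000)
print(g(f).mean(), g(f).var())
```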

Automated Variational Inference for Gaussian Process Models

no code implementations • NeurIPS 2014 • Trung V. Nguyen, Edwin V. Bonilla

Using a mixture of Gaussians as the variational distribution, we show that (i) the variational objective and its gradients can be approximated efficiently via sampling from univariate Gaussian distributions and (ii) the gradients of the GP hyperparameters can be obtained analytically regardless of the model likelihood.

Variational Inference
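Point (i) is concrete enough to sketch: the expected log-likelihood term of the variational objective decomposes over data points, so it can be estimated with draws from univariate Gaussians only. A minimal illustration with a Bernoulli-logit likelihood (the mixture-of-Gaussians posterior is simplified here to a single Gaussian per point):

```python
import numpy as np

rng = np.random.default_rng(5)

# Bernoulli-logit likelihood; q(f_i) = N(m_i, s2_i) for each data point.
y = np.array([1, 0, 1, 1])
m = np.array([1.0, -0.5, 0.3, 2.0])
s2 = np.array([0.2, 0.4, 0.1, 0.3])

# The expected log-likelihood factorizes over points, so S x N draws from
# univariate Gaussians suffice -- no joint samples over all latents needed.
S = 10000
f = m + np.sqrt(s2) * rng.standard_normal((S, len(y)))
loglik = y * f - np.log1p(np.exp(f))              # log p(y_i | f_i), Bernoulli-logit
ell = loglik.mean(0).sum()                        # MC estimate of E_q[log p(y | f)]
print(ell)
```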

Gaussian Process Preference Elicitation

no code implementations • NeurIPS 2010 • Shengbo Guo, Scott Sanner, Edwin V. Bonilla

Bayesian approaches to preference elicitation (PE) are particularly attractive due to their ability to explicitly model uncertainty in users' latent utility functions.
