Search Results for author: Edwin V. Bonilla

Found 32 papers, 12 papers with code

Optimal Transport for Structure Learning Under Missing Data

1 code implementation · 23 Feb 2024 · Vy Vo, He Zhao, Trung Le, Edwin V. Bonilla, Dinh Phung

Merely filling in missing values with existing imputation methods and subsequently applying structure learning to the complete data is empirically shown to be sub-optimal.

Causal Discovery · Imputation

Bayesian Factorised Granger-Causal Graphs For Multivariate Time-series Data

no code implementations · 6 Feb 2024 · He Zhao, Edwin V. Bonilla

We study the problem of automatically discovering Granger causal relations from observational multivariate time-series data.

Time Series · Uncertainty Quantification
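
For readers new to the topic, the classical starting point this paper generalises is the vector-autoregressive (VAR) view: series j Granger-causes series i when the lagged coefficient A[i, j] is non-negligible. A minimal numpy sketch of that baseline (the paper's Bayesian factorised model is more involved; all numbers here are illustrative):

```python
import numpy as np

# Toy 2-series example: series 0 drives series 1 with one step of lag.
rng = np.random.default_rng(0)
T = 500
x = np.zeros((T, 2))
for t in range(1, T):
    x[t, 0] = 0.5 * x[t - 1, 0] + rng.normal(scale=0.1)
    x[t, 1] = 0.8 * x[t - 1, 0] + 0.2 * x[t - 1, 1] + rng.normal(scale=0.1)

# Fit a lag-1 VAR by least squares: x[t] ~ A @ x[t-1].
X_past, X_now = x[:-1], x[1:]
A = np.linalg.lstsq(X_past, X_now, rcond=None)[0].T

# A[i, j] far from zero suggests series j Granger-causes series i.
print(np.round(A, 2))  # expect A[1, 0] close to 0.8, A[0, 1] close to 0.0
```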

Variational DAG Estimation via State Augmentation With Stochastic Permutations

no code implementations · 4 Feb 2024 · Edwin V. Bonilla, Pantelis Elinas, He Zhao, Maurizio Filippone, Vassili Kitsios, Terry O'Kane

Estimating the structure of a Bayesian network, in the form of a directed acyclic graph (DAG), from observational data is a statistically and computationally hard problem with essential applications in areas such as causal discovery.

Causal Discovery · Uncertainty Quantification +1

Contextual Directed Acyclic Graphs

1 code implementation · 24 Oct 2023 · Ryan Thompson, Edwin V. Bonilla, Robert Kohn

Estimating the structure of directed acyclic graphs (DAGs) from observational data remains a significant challenge in machine learning.

Statistically Efficient Bayesian Sequential Experiment Design via Reinforcement Learning with Cross-Entropy Estimators

no code implementations · 29 May 2023 · Tom Blau, Iadine Chades, Amir Dezfouli, Daniel Steinberg, Edwin V. Bonilla

We propose the use of an alternative estimator based on the cross-entropy of the joint model distribution and a flexible proposal distribution.

reinforcement-learning
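
For context, the estimator being replaced is typically a nested Monte Carlo estimate of expected information gain (EIG), in which the inner marginal likelihood p(y|d) is importance-sampled under a proposal q(theta). Below is a sketch of that baseline for a toy linear-Gaussian model, assuming a N(0, 2^2) proposal; the paper's cross-entropy estimator differs in how this inner term is built, so treat this purely as orientation:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_lik(y, theta, d, sigma=1.0):
    # Gaussian likelihood: y ~ N(d * theta, sigma^2)
    return -0.5 * ((y - d * theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def eig_estimate(d, n_outer=2000, n_inner=2000):
    """Nested Monte Carlo estimate of expected information gain at design d,
    with the inner marginal p(y|d) importance-sampled from a proposal q."""
    theta = rng.normal(0.0, 1.0, n_outer)           # prior p(theta) = N(0, 1)
    y = d * theta + rng.normal(0.0, 1.0, n_outer)   # y ~ p(y | theta, d)

    # Proposal q(theta) = N(0, 2^2), broader than the prior to cover its tails.
    th_q = rng.normal(0.0, 2.0, n_inner)
    log_w = (-0.5 * th_q**2) - (-0.5 * (th_q / 2.0) ** 2 - np.log(2.0))  # log p/q

    # log p(y|d) ~ log mean_m [ p(y | theta_m, d) * p(theta_m) / q(theta_m) ]
    ll = log_lik(y[:, None], th_q[None, :], d) + log_w[None, :]
    log_marg = np.logaddexp.reduce(ll, axis=1) - np.log(n_inner)
    return np.mean(log_lik(y, theta, d) - log_marg)

print(eig_estimate(0.5), eig_estimate(2.0))  # larger |d| is more informative
```

For this conjugate model the exact answer is 0.5 * log(1 + d^2), which the estimates should roughly recover.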

Free-Form Variational Inference for Gaussian Process State-Space Models

1 code implementation · 20 Feb 2023 · Xuhui Fan, Edwin V. Bonilla, Terence J. O'Kane, Scott A. Sisson

However, inference in GPSSMs is computationally and statistically challenging due to the large number of latent variables in the model and the strong temporal dependencies between them.

Variational Inference

Recurrent Neural Networks and Universal Approximation of Bayesian Filters

no code implementations · 1 Nov 2022 · Adrian N. Bishop, Edwin V. Bonilla

We consider the Bayesian optimal filtering problem, i.e., estimating some conditional statistics of a latent time-series signal from an observation sequence.

Time Series · Time Series Analysis
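
The linear-Gaussian special case of this filtering problem is solved exactly by the Kalman filter, which is the usual reference point when judging RNN approximations. A minimal scalar sketch, with all model parameters illustrative:

```python
import numpy as np

def kalman_filter(ys, a=0.9, c=1.0, q=0.1, r=0.5):
    """Exact Bayesian filtering for the linear-Gaussian model
    x_t = a x_{t-1} + N(0, q),  y_t = c x_t + N(0, r).
    Returns the filtering means E[x_t | y_{1:t}] and variances."""
    m, p = 0.0, 1.0          # prior N(0, 1) over x_0
    means, variances = [], []
    for y in ys:
        # Predict: push the current posterior through the dynamics.
        m, p = a * m, a * a * p + q
        # Update: condition on the new observation y_t.
        k = p * c / (c * c * p + r)          # Kalman gain
        m, p = m + k * (y - c * m), (1 - k * c) * p
        means.append(m)
        variances.append(p)
    return np.array(means), np.array(variances)

# Simulate a short sequence from the model, then filter it.
rng = np.random.default_rng(2)
x, ys = 0.0, []
for _ in range(100):
    x = 0.9 * x + rng.normal(scale=np.sqrt(0.1))
    ys.append(x + rng.normal(scale=np.sqrt(0.5)))
means, variances = kalman_filter(np.array(ys))
```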

Addressing Over-Smoothing in Graph Neural Networks via Deep Supervision

no code implementations · 25 Feb 2022 · Pantelis Elinas, Edwin V. Bonilla

Learning useful node and graph representations with graph neural networks (GNNs) is a challenging task.

Graph Property Prediction · Property Prediction
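
The title's "deep supervision" refers, in its generic form, to attaching an auxiliary loss to every intermediate layer rather than supervising only the last one, which keeps gradients flowing to early layers of deep GNNs. A hedged PyTorch sketch of that general pattern; the paper's exact scheme may differ, and all module names and sizes here are made up for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeeplySupervisedGCN(nn.Module):
    """GCN where every layer gets its own classification head, so the loss
    reaches early layers directly (generic deep supervision; illustrative)."""
    def __init__(self, in_dim, hid_dim, n_classes, n_layers=8):
        super().__init__()
        dims = [in_dim] + [hid_dim] * n_layers
        self.layers = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(n_layers))
        self.heads = nn.ModuleList(
            nn.Linear(hid_dim, n_classes) for _ in range(n_layers))

    def forward(self, x, a_norm):
        # a_norm: symmetrically normalised adjacency with self-loops.
        logits_per_layer = []
        for layer, head in zip(self.layers, self.heads):
            x = F.relu(layer(a_norm @ x))
            logits_per_layer.append(head(x))
        return logits_per_layer

def deep_supervision_loss(logits_per_layer, labels):
    # Sum the loss over all layers instead of using only the final layer.
    return sum(F.cross_entropy(lg, labels) for lg in logits_per_layer)
```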

Optimizing Sequential Experimental Design with Deep Reinforcement Learning

1 code implementation · 2 Feb 2022 · Tom Blau, Edwin V. Bonilla, Iadine Chades, Amir Dezfouli

Bayesian approaches developed to solve the optimal design of sequential experiments are mathematically elegant but computationally challenging.

Experimental Design · reinforcement-learning +1

Learning ODEs via Diffeomorphisms for Fast and Robust Integration

no code implementations · 4 Jul 2021 · Weiming Zhi, Tin Lai, Lionel Ott, Edwin V. Bonilla, Fabio Ramos

Advances in differentiable numerical integrators have enabled the use of gradient descent techniques to learn ordinary differential equations (ODEs).
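
That premise can be made concrete in a few lines: when the integrator is differentiable, the ODE's parameters can be fitted by backpropagating through the unrolled solver. A toy PyTorch sketch (RK4 on dx/dt = theta * x); this illustrates the premise only, not the paper's diffeomorphism-based method, and all constants are assumptions:

```python
import torch

def rk4_step(f, x, dt):
    # One Runge-Kutta 4 step; autodiff flows through all four stages.
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Ground truth: dx/dt = -1.5 x. Learn theta in f(x) = theta * x.
dt, steps = 0.1, 20
with torch.no_grad():
    traj = [torch.tensor(1.0)]
    for _ in range(steps):
        traj.append(rk4_step(lambda x: -1.5 * x, traj[-1], dt))
    target = torch.stack(traj)

theta = torch.tensor(0.0, requires_grad=True)
opt = torch.optim.Adam([theta], lr=0.1)
for _ in range(200):
    xs = [torch.tensor(1.0)]
    for _ in range(steps):
        xs.append(rk4_step(lambda x: theta * x, xs[-1], dt))
    loss = ((torch.stack(xs) - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()     # gradient w.r.t. theta through the unrolled solver
    opt.step()
print(theta.item())     # converges towards -1.5
```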

Model Selection for Bayesian Autoencoders

1 code implementation · NeurIPS 2021 · Ba-Hien Tran, Simone Rossi, Dimitrios Milios, Pietro Michiardi, Edwin V. Bonilla, Maurizio Filippone

We develop a novel method for carrying out model selection for Bayesian autoencoders (BAEs) by means of prior hyper-parameter optimization.

Model Selection · Representation Learning

SigGPDE: Scaling Sparse Gaussian Processes on Sequential Data

no code implementations · 10 May 2021 · Maud Lemercier, Cristopher Salvi, Thomas Cass, Edwin V. Bonilla, Theodoros Damoulas, Terry Lyons

Making predictions and quantifying their uncertainty when the input data is sequential is a fundamental learning challenge, recently attracting increasing attention.

Gaussian Processes · Time Series +2

Distribution Regression for Sequential Data

no code implementations · 10 Jun 2020 · Maud Lemercier, Cristopher Salvi, Theodoros Damoulas, Edwin V. Bonilla, Terry Lyons

In this paper, we develop a rigorous mathematical framework for distribution regression where inputs are complex data streams.

regression · Time Series +1

Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations

no code implementations · 6 Mar 2020 · Simone Rossi, Markus Heinonen, Edwin V. Bonilla, Zheyang Shen, Maurizio Filippone

Variational inference techniques based on inducing variables provide an elegant framework for scalable posterior estimation in Gaussian process (GP) models.

Gaussian Processes · Variational Inference
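
The inducing-variable construction the paper revisits conditions on M much smaller than N inducing values u at locations Z, cutting cost from O(N^3) to O(N M^2). A numpy sketch of the classic collapsed (Titsias-style) posterior mean for regression; the kernel, jitter, and all sizes are illustrative assumptions:

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel between two 1-D input sets.
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls**2)

rng = np.random.default_rng(3)
X = rng.uniform(0, 5, 200)
y = np.sin(X) + rng.normal(scale=0.1, size=200)
Z = np.linspace(0, 5, 15)          # M = 15 inducing inputs
noise = 0.1 ** 2

# Collapsed posterior over the inducing values u:
Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
Kzx = rbf(Z, X)
A = Kzz + Kzx @ Kzx.T / noise
m_u = Kzz @ np.linalg.solve(A, Kzx @ y) / noise    # E[u | y]

# Predictive mean at test points: K_*z Kzz^{-1} m_u  (O(N M^2) overall).
Xs = np.linspace(0, 5, 100)
mean = rbf(Xs, Z) @ np.linalg.solve(Kzz, m_u)
```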

Quantile Propagation for Wasserstein-Approximate Gaussian Processes

1 code implementation · NeurIPS 2020 · Rui Zhang, Christian J. Walder, Edwin V. Bonilla, Marian-Andrei Rizoiu, Lexing Xie

We show that QP matches quantile functions rather than moments as in EP and has the same mean update but a smaller variance update than EP, thereby alleviating EP's tendency to over-estimate posterior variances.

Bayesian Inference · Gaussian Processes
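
The quantile-matching idea has a compact statement: for univariate distributions, the squared 2-Wasserstein distance is exactly the L2 distance between quantile functions, so minimising it matches quantiles where EP's KL objective matches moments. The standard identity, in textbook notation rather than the paper's:

```latex
W_2^2(p, q) = \int_0^1 \left( F_p^{-1}(u) - F_q^{-1}(u) \right)^2 \, du
```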

Structured Variational Inference in Continuous Cox Process Models

1 code implementation · NeurIPS 2019 · Virginia Aglietti, Edwin V. Bonilla, Theodoros Damoulas, Sally Cripps

We propose a scalable framework for inference in an inhomogeneous Poisson process modeled by a continuous sigmoidal Cox process that assumes the corresponding intensity function is given by a Gaussian process (GP) prior transformed with a scaled logistic sigmoid function.

Numerical Integration · Uncertainty Quantification +1
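
The generative model in that abstract pairs naturally with simulation by thinning: draw g from the GP, set lambda(x) = lambda_max * sigmoid(g(x)), and keep each point of a rate-lambda_max Poisson process with probability lambda(x) / lambda_max. A numpy sketch in which the grid, kernel, and lambda_max are all assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Draw an intensity lambda(x) = lam_max * sigmoid(g(x)), g ~ GP(0, k), on [0, 10].
grid = np.linspace(0, 10, 400)
K = np.exp(-0.5 * (grid[:, None] - grid[None, :]) ** 2 / 1.0**2)
g = rng.multivariate_normal(np.zeros(len(grid)), K + 1e-8 * np.eye(len(grid)))
lam_max = 5.0

# Simulate the inhomogeneous Poisson process by thinning a rate-lam_max process.
n = rng.poisson(lam_max * 10.0)          # candidate points on [0, 10]
candidates = rng.uniform(0, 10, n)
g_at = np.interp(candidates, grid, g)    # interpolate the GP draw
events = candidates[rng.uniform(size=n) < sigmoid(g_at)]
print(len(events), "events kept out of", n, "candidates")
```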

Variational Inference for Graph Convolutional Networks in the Absence of Graph Data and Adversarial Settings

1 code implementation · NeurIPS 2020 · Pantelis Elinas, Edwin V. Bonilla, Louis Tiao

We propose a framework that lifts the capabilities of graph convolutional networks (GCNs) to scenarios where no input graph is given and increases their robustness to adversarial attacks.

Bayesian Inference · General Classification +1

Scalable Grouped Gaussian Processes via Direct Cholesky Functional Representations

no code implementations · 10 Mar 2019 · Astrid Dahl, Edwin V. Bonilla

We consider multi-task regression models where observations are assumed to be a linear combination of several latent node and weight functions, all drawn from Gaussian process (GP) priors that allow nonzero covariance between grouped latent functions.

Gaussian Processes · Variational Inference
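
The model described above (shared with the solar-power paper below) is easiest to see generatively: draw Q latent node functions and P x Q weight functions from GPs, then combine them pointwise per task. A numpy sketch with illustrative sizes and kernel:

```python
import numpy as np

rng = np.random.default_rng(5)
X = np.linspace(0, 5, 100)
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / 0.7**2) + 1e-8 * np.eye(100)

Q, P = 2, 3   # Q latent node functions, P output tasks
f = rng.multivariate_normal(np.zeros(100), K, size=Q)        # node functions
w = rng.multivariate_normal(np.zeros(100), K, size=(P, Q))   # weight functions

# Each task p is a pointwise linear combination of the node functions:
#   y_p(x) = sum_q w_pq(x) * f_q(x) + noise
Y = np.einsum('pqn,qn->pn', w, f) + rng.normal(scale=0.05, size=(P, 100))
```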

Grouped Gaussian Processes for Solar Power Prediction

no code implementations · 7 Jun 2018 · Astrid Dahl, Edwin V. Bonilla

We consider multi-task regression models where the observations are assumed to be a linear combination of several latent node functions and weight functions, which are both drawn from Gaussian process priors.

Gaussian Processes

Cycle-Consistent Adversarial Learning as Approximate Bayesian Inference

no code implementations · 5 Jun 2018 · Louis C. Tiao, Edwin V. Bonilla, Fabio Ramos

We formalize the problem of learning interdomain correspondences in the absence of paired data as Bayesian inference in a latent variable model (LVM), where one seeks the underlying hidden representations of entities from one domain as entities from the other domain.

Bayesian Inference · Variational Inference

Calibrating Deep Convolutional Gaussian Processes

1 code implementation · 26 May 2018 · Gia-Lac Tran, Edwin V. Bonilla, John P. Cunningham, Pietro Michiardi, Maurizio Filippone

The wide adoption of Convolutional Neural Networks (CNNs) in applications where decision-making under uncertainty is fundamental has brought a great deal of attention to the ability of these models to accurately quantify the uncertainty in their predictions.

Decision Making · Decision Making Under Uncertainty +3

Semi-parametric Network Structure Discovery Models

no code implementations · 27 Feb 2017 · Amir Dezfouli, Edwin V. Bonilla, Richard Nock

We propose a network structure discovery model for continuous observations that generalizes linear causal models by incorporating a Gaussian process (GP) prior on a network-independent component, and random sparsity and weight matrices as the network-dependent parameters.

Uncertainty Quantification · Variational Inference

AutoGP: Exploring the Capabilities and Limitations of Gaussian Process Models

no code implementations · 18 Oct 2016 · Karl Krauth, Edwin V. Bonilla, Kurt Cutajar, Maurizio Filippone

We investigate the capabilities and limitations of Gaussian process models by jointly exploring three complementary directions: (i) scalable and statistically efficient inference; (ii) flexible kernels; and (iii) objective functions for hyperparameter learning alternative to the marginal likelihood.

General Classification

Random Feature Expansions for Deep Gaussian Processes

1 code implementation · ICML 2017 · Kurt Cutajar, Edwin V. Bonilla, Pietro Michiardi, Maurizio Filippone

The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty.

Gaussian Processes · Variational Inference
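
The expansion in question is, at its core, the textbook random Fourier feature map of Rahimi and Recht: an RBF-kernel GP layer is replaced by a finite cosine feature map, so stacking such layers turns the DGP into an ordinary deep network trainable with standard stochastic methods. A minimal single-layer numpy sketch with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(6)

def rff(X, n_features=500, lengthscale=1.0):
    """Random Fourier feature map phi such that
    phi(x) . phi(x') approximates exp(-|x - x'|^2 / (2 l^2))."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(5, 3))
phi = rff(X)
print(np.round(phi @ phi.T, 2))   # feature-space inner products ...
print(np.round(np.exp(-0.5 * ((X[:, None] - X[None, :]) ** 2).sum(-1)), 2))
# ... should be close to the exact RBF kernel matrix above.
```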

Gray-box inference for structured Gaussian process models

no code implementations · 14 Sep 2016 · Pietro Galliani, Amir Dezfouli, Edwin V. Bonilla, Novi Quadrianto

We develop an automated variational inference method for Bayesian structured prediction problems with Gaussian process (GP) priors and linear-chain likelihoods.

Stochastic Optimization · Structured Prediction +1

Generic Inference in Latent Gaussian Process Models

1 code implementation · 2 Sep 2016 · Edwin V. Bonilla, Karl Krauth, Amir Dezfouli

We evaluate our approach quantitatively and qualitatively with experiments on small, medium-scale and large datasets, showing its competitiveness under different likelihood models and sparsity levels.

General Classification · Stochastic Optimization

Scalable Inference for Gaussian Process Models with Black-Box Likelihoods

no code implementations · NeurIPS 2015 · Amir Dezfouli, Edwin V. Bonilla

We propose a sparse method for scalable automated variational inference (AVI) in a large class of models with Gaussian process (GP) priors, multiple latent functions, multiple outputs and non-linear likelihoods.

General Classification · regression +1

Automated Variational Inference for Gaussian Process Models

no code implementations · NeurIPS 2014 · Trung V. Nguyen, Edwin V. Bonilla

Using a mixture of Gaussians as the variational distribution, we show that (i) the variational objective and its gradients can be approximated efficiently via sampling from univariate Gaussian distributions and (ii) the gradients of the GP hyperparameters can be obtained analytically regardless of the model likelihood.

Variational Inference
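
Point (i) of that abstract is the reparameterisation pattern: with a (mixture of) diagonal Gaussian(s) as q, a Monte Carlo estimate of the expected log-likelihood and its gradients needs only univariate Gaussian samples. A PyTorch sketch for a single diagonal component, with a Gaussian likelihood standing in for "regardless of the model likelihood"; all sizes are illustrative:

```python
import torch

# Reparameterised Monte Carlo estimate of E_q[log p(y | f)] for a factorised
# Gaussian q(f) = prod_n N(f_n | mu_n, s_n^2): only univariate samples needed.
y = torch.randn(50)
mu = torch.zeros(50, requires_grad=True)
log_s = torch.zeros(50, requires_grad=True)

eps = torch.randn(10, 50)                    # univariate standard normals
f = mu + torch.exp(log_s) * eps              # reparameterisation trick
log_lik = -0.5 * (y - f) ** 2                # any pointwise log-likelihood
mc_objective = log_lik.sum(dim=1).mean()     # average over the 10 samples
mc_objective.backward()                      # gradients w.r.t. mu and log_s
print(mu.grad.shape, log_s.grad.shape)
```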

Extended and Unscented Gaussian Processes

no code implementations · NeurIPS 2014 · Daniel M. Steinberg, Edwin V. Bonilla

We present two new methods for inference in Gaussian process (GP) models with general nonlinear likelihoods.

Binary Classification · Gaussian Processes +1
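
The "unscented" method in the title adapts the unscented transform from nonlinear filtering: propagate a small, deterministically chosen set of sigma points through the nonlinearity and re-estimate moments from them. A 1-D numpy sketch of the transform itself; kappa and the link function are illustrative assumptions:

```python
import numpy as np

def unscented_moments(mu, var, g, kappa=2.0):
    """Approximate mean and variance of g(x) for x ~ N(mu, var) via sigma points."""
    n = 1  # one-dimensional case
    spread = np.sqrt((n + kappa) * var)
    points = np.array([mu, mu + spread, mu - spread])
    weights = np.array([kappa / (n + kappa),
                        0.5 / (n + kappa),
                        0.5 / (n + kappa)])
    gy = g(points)
    mean = weights @ gy
    return mean, weights @ (gy - mean) ** 2

# Push N(0, 1) through a nonlinear likelihood link, e.g. tanh.
print(unscented_moments(0.0, 1.0, np.tanh))
```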

Gaussian Process Preference Elicitation

no code implementations · NeurIPS 2010 · Shengbo Guo, Scott Sanner, Edwin V. Bonilla

Bayesian approaches to preference elicitation (PE) are particularly attractive due to their ability to explicitly model uncertainty in users' latent utility functions.
