Search Results for author: Thang D. Bui

Found 14 papers, 6 papers with code

Variational Auto-Regressive Gaussian Processes for Continual Learning

1 code implementation • 9 Jun 2020 • Sanyam Kapoor, Theofanis Karaletsos, Thang D. Bui

By sequentially constructing posteriors as data are observed online, Bayes' theorem provides a natural framework for continual learning.

Bayesian Inference • Continual Learning +1
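To make the sequential-posterior idea concrete, here is a minimal sketch in a toy conjugate model, a 1-D Gaussian mean with known noise standing in for the paper's GP posterior (all names below are illustrative, not from the paper): the posterior after each batch becomes the prior for the next.

```python
# Recursive Bayes update: p(theta | D_1..t) ∝ p(D_t | theta) p(theta | D_1..t-1).
import numpy as np

def update(prior_mean, prior_var, batch, noise_var=0.5):
    """Conjugate Gaussian update for an unknown mean with known noise variance."""
    n = len(batch)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + batch.sum() / noise_var)
    return post_mean, post_var

rng = np.random.default_rng(0)
mean, var = 0.0, 10.0                      # broad initial prior
for t in range(5):                         # stream of data batches
    batch = rng.normal(2.0, 0.7, size=20)  # true mean is 2.0
    mean, var = update(mean, var, batch)   # yesterday's posterior -> today's prior
print(mean, var)                           # mean -> 2.0, var shrinks as data accumulate
```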

Hierarchical Gaussian Process Priors for Bayesian Neural Network Weights

no code implementations NeurIPS 2020 Theofanis Karaletsos, Thang D. Bui

Probabilistic neural networks are typically modeled with independent weight priors, which do not capture weight correlations in the prior and do not provide a parsimonious interface to express properties in function space.

Active Learning
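A hedged sketch of the core idea: rather than independent N(0, 1) priors, give each weight of a layer a coordinate and draw all weights jointly from a GP over those coordinates, so the prior encodes correlations. The grid coordinates and RBF kernel below are illustrative stand-ins, not the paper's hierarchical unit-embedding construction.

```python
# Correlated weight prior: weights indexed by (input unit, output unit),
# jointly Gaussian under a kernel over those indices.
import numpy as np

d_in, d_out = 4, 3
coords = np.array([[i, j] for i in range(d_in) for j in range(d_out)], float)

def rbf(x, y, ell=1.5):
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

K = rbf(coords, coords) + 1e-6 * np.eye(len(coords))   # prior covariance over all weights
rng = np.random.default_rng(1)
w = rng.multivariate_normal(np.zeros(len(coords)), K)  # one correlated weight sample
W = w.reshape(d_in, d_out)                             # use as the layer's weight matrix
print(W)
```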

Variational Continual Learning

8 code implementations ICLR 2018 Cuong V. Nguyen, Yingzhen Li, Thang D. Bui, Richard E. Turner

This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks.

Continual Learning • Variational Inference
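The objective is easy to state concretely: at task t, maximise the expected log-likelihood of the new data minus KL(q_t || q_{t-1}), so the previous task's posterior plays the role of the prior. A minimal sketch under a mean-field Gaussian posterior, with the likelihood term left abstract (names illustrative):

```python
# VCL-style negative ELBO: data fit plus a KL pull towards the previous posterior.
import numpy as np

def kl_diag_gauss(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians."""
    var_q, var_p = np.exp(logvar_q), np.exp(logvar_p)
    return 0.5 * np.sum(logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def vcl_loss(exp_log_lik, mu_q, logvar_q, mu_prev, logvar_prev):
    # negative ELBO for task t: q_{t-1} replaces the prior in the KL term
    return -exp_log_lik + kl_diag_gauss(mu_q, logvar_q, mu_prev, logvar_prev)

mu_prev, logvar_prev = np.zeros(10), np.zeros(10)   # q_0 = N(0, I) prior
mu_q, logvar_q = 0.3 * np.ones(10), -1.0 * np.ones(10)
print(vcl_loss(exp_log_lik=-42.0, mu_q=mu_q, logvar_q=logvar_q,
               mu_prev=mu_prev, logvar_prev=logvar_prev))
```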

Streaming Sparse Gaussian Process Approximations

3 code implementations NeurIPS 2017 Thang D. Bui, Cuong V. Nguyen, Richard E. Turner

Sparse pseudo-point approximations for Gaussian process (GP) models provide a suite of methods that support deployment of GPs in the large data regime and enable analytic intractabilities to be sidestepped.
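For context, a minimal sketch of the generic pseudo-point predictive this family builds on: M << N inducing inputs summarise the data, replacing an O(N^3) solve with an O(N M^2) one. This is the standard batch approximation, not the paper's streaming update:

```python
# Sparse GP predictive mean via M inducing inputs Z (generic, batch setting).
import numpy as np

def rbf(x, y, ell=1.0):
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / ell**2)

rng = np.random.default_rng(2)
X = np.sort(rng.uniform(-3, 3, 200))            # N = 200 training inputs
y = np.sin(X) + 0.1 * rng.standard_normal(200)
Z = np.linspace(-3, 3, 10)                      # M = 10 pseudo-inputs
Xs = np.linspace(-3, 3, 5)                      # test inputs
sigma2 = 0.01

Kuu = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
Kuf = rbf(Z, X)
Ksu = rbf(Xs, Z)
A = Kuu + Kuf @ Kuf.T / sigma2                  # M x M system instead of N x N
mean = Ksu @ np.linalg.solve(A, Kuf @ y) / sigma2
print(mean)                                     # approximate posterior mean at Xs
```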

Neural Graph Machines: Learning Neural Networks Using Graphs

no code implementations • 14 Mar 2017 • Thang D. Bui, Sujith Ravi, Vivek Ramavajjala

In this work, we propose a training framework with a graph-regularised objective, namely "Neural Graph Machines", that can combine the power of neural networks and label propagation.

Document Classification • General Classification +3
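A minimal sketch of such a graph-regularised objective: a supervised loss plus a penalty pulling the hidden representations of graph-neighbouring inputs together. The network, supervised term, and graph below are toy stand-ins, not the paper's architecture:

```python
# Graph-regularised training objective: supervised loss + edge-weighted
# agreement between hidden representations of neighbouring nodes.
import numpy as np

def hidden(X, W):                        # one-layer embedding h(x) = tanh(Wx)
    return np.tanh(X @ W)

def ngm_loss(X, y, W, edges, alpha=0.1):
    H = hidden(X, W)
    sup = np.mean((y - H.sum(axis=1)) ** 2)            # placeholder supervised term
    graph = sum(w_uv * np.sum((H[u] - H[v]) ** 2)      # sum over weighted edges
                for u, v, w_uv in edges)
    return sup + alpha * graph

rng = np.random.default_rng(3)
X = rng.standard_normal((6, 4))
y = rng.standard_normal(6)
W = rng.standard_normal((4, 3))
edges = [(0, 1, 1.0), (1, 2, 0.5), (3, 4, 1.0)]        # (u, v, weight)
print(ngm_loss(X, y, W, edges))
```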

A Unifying Framework for Gaussian Process Pseudo-Point Approximations using Power Expectation Propagation

1 code implementation • 23 May 2016 • Thang D. Bui, Josiah Yan, Richard E. Turner

Unlike much of the previous venerable work in this area, the new framework is built on standard methods for approximate inference (variational free-energy, EP and Power EP methods) rather than employing approximations to the probabilistic generative model itself.

Gaussian Processes
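The α machinery can be sketched in one dimension. With natural parameters λ = (precision, precision × mean), a Power EP site update deletes a fraction α of the site, moment-matches the tilted distribution, and re-derives the site; α = 1 recovers EP and α → 0 approaches the variational free-energy solution. In the Gaussian toy model below the projection is exact, so the update converges immediately for any α; α only changes behaviour when the tilted distribution is non-Gaussian (a sketch, not the paper's multivariate GP sites):

```python
# One Power EP site update for a 1-D Gaussian model with a Gaussian likelihood.
import numpy as np

def pep_site_update(lam_prior, lam_site, lam_lik, alpha=0.5):
    """Natural parameters lam = (precision, precision * mean)."""
    lam_q = lam_prior + lam_site              # q ∝ prior × site
    lam_cav = lam_q - alpha * lam_site        # cavity: delete a fraction alpha of the site
    lam_tilt = lam_cav + alpha * lam_lik      # tilted; Gaussian lik makes projection exact
    return (lam_tilt - lam_cav) / alpha       # moment-matched new site

lam_prior = np.array([1.0, 0.0])              # N(0, 1) prior
lam_lik = np.array([4.0, 6.0])                # y = 1.5 with noise variance 0.25
lam_site = np.zeros(2)
for _ in range(5):
    lam_site = pep_site_update(lam_prior, lam_site, lam_lik, alpha=0.5)
print(lam_prior + lam_site)                   # converges to the exact posterior [5., 6.]
```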

Deep Gaussian Processes for Regression using Approximate Expectation Propagation

no code implementations12 Feb 2016 Thang D. Bui, Daniel Hernández-Lobato, Yingzhen Li, José Miguel Hernández-Lobato, Richard E. Turner

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers.

Gaussian Processes regression
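The hierarchical construction is easy to sample from at a finite set of inputs: draw layer-1 function values from a GP, then use them as inputs to a second GP. A minimal two-layer sketch (kernel choices illustrative):

```python
# Sampling a two-layer deep GP prior at a finite grid of inputs.
import numpy as np

def rbf(x, y, ell=1.0):
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / ell**2)

rng = np.random.default_rng(4)
x = np.linspace(-3, 3, 100)
K1 = rbf(x, x) + 1e-8 * np.eye(100)
h = rng.multivariate_normal(np.zeros(100), K1)    # layer-1 function values
K2 = rbf(h, h, ell=0.5) + 1e-8 * np.eye(100)      # kernel on the warped inputs
f = rng.multivariate_normal(np.zeros(100), K2)    # layer-2 (output) values
print(f[:5])
```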

Learning Stationary Time Series using Gaussian Processes with Nonparametric Kernels

no code implementations NeurIPS 2015 Felipe Tobar, Thang D. Bui, Richard E. Turner

We introduce the Gaussian Process Convolution Model (GPCM), a two-stage nonparametric generative procedure to model stationary signals as the convolution between a continuous-time white-noise process and a continuous-time linear filter drawn from a Gaussian process.

Denoising • Gaussian Processes +3
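A hedged sketch of the GPCM generative story in discrete time: draw a filter from a GP, damp it so it is integrable, and convolve it with white noise. The continuous-time model uses a continuous convolution; this discretisation is illustrative only:

```python
# GPCM-style sample: white noise convolved with a GP-drawn filter.
import numpy as np

def rbf(x, y, ell=0.3):
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / ell**2)

rng = np.random.default_rng(5)
tau = np.linspace(0, 2, 50)                       # filter support
Kh = rbf(tau, tau) + 1e-8 * np.eye(50)
h = rng.multivariate_normal(np.zeros(50), Kh)     # filter drawn from a GP
h *= np.exp(-2.0 * tau)                           # decay so the filter is integrable
w = rng.standard_normal(500)                      # white-noise process
y = np.convolve(w, h, mode="valid")               # sample of a stationary signal
print(y[:5])
```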

Training Deep Gaussian Processes using Stochastic Expectation Propagation and Probabilistic Backpropagation

no code implementations • 11 Nov 2015 • Thang D. Bui, José Miguel Hernández-Lobato, Yingzhen Li, Daniel Hernández-Lobato, Richard E. Turner

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers.

Gaussian Processes

Tree-structured Gaussian Process Approximations

no code implementations NeurIPS 2014 Thang D. Bui, Richard E. Turner

Gaussian process regression can be accelerated by constructing a small pseudo-dataset to summarise the observed data.

Imputation • regression +2
