Search Results for author: Neill D. F. Campbell

Found 27 papers, 9 papers with code

Likelihood-based Out-of-Distribution Detection with Denoising Diffusion Probabilistic Models

no code implementations • 26 Oct 2023 Joseph Goodier, Neill D. F. Campbell

We present results that are comparable to state-of-the-art Out-of-Distribution detection methods with generative models.

Denoising Out-of-Distribution Detection
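The general recipe behind likelihood-based OOD detection can be illustrated with a toy stand-in for the generative model. This is a hedged sketch, not the paper's method: a one-dimensional Gaussian plays the role of the diffusion model, and the 1st-percentile threshold is an assumed design choice.

```python
import numpy as np

# Hedged sketch: a 1-D Gaussian stands in for the trained generative
# (diffusion) model; points with unusually low likelihood are flagged OOD.
rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, scale=1.0, size=5000)

mu, sigma = train.mean(), train.std()

def log_likelihood(x):
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

# Assumed design choice: flag anything below the 1st percentile of
# training log-likelihoods as out-of-distribution.
threshold = np.percentile(log_likelihood(train), 1)

def is_ood(x):
    return log_likelihood(x) < threshold

print(bool(is_ood(0.1)), bool(is_ood(8.0)))  # False True
```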

Compressed Sensing MRI Reconstruction Regularized by VAEs with Structured Image Covariance

no code implementations • 26 Oct 2022 Margaret Duff, Ivor J. A. Simpson, Matthias J. Ehrhardt, Neill D. F. Campbell

The covariance can model changing uncertainty dependencies caused by structure in the image, such as edges or objects, and provides a new distance metric from the manifold of learned images.

MRI Reconstruction

Analysing Training-Data Leakage from Gradients through Linear Systems and Gradient Matching

1 code implementation • 20 Oct 2022 Cangxiong Chen, Neill D. F. Campbell

As a result, we are able to partially attribute the leakage of the training data in a deep network to its architecture.

Attribute Image Classification

Learning Structured Gaussians to Approximate Deep Ensembles

no code implementations CVPR 2022 Ivor J. A. Simpson, Sara Vicente, Neill D. F. Campbell

Similarly to distillation approaches, our single network is trained to maximise the probability of samples from pre-trained probabilistic models; in this work, we use a fixed ensemble of networks.

Monocular Depth Estimation

Understanding Training-Data Leakage from Gradients in Neural Networks for Image Classification

1 code implementation • 19 Nov 2021 Cangxiong Chen, Neill D. F. Campbell

Based on this formulation, we are able to attribute the potential leakage of the training data in a deep network to its architecture.

Attribute Federated Learning +1
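A simple special case illustrates why gradients can leak training data. This is a hedged toy example, not the paper's analysis: for a fully connected layer with bias, the per-example weight gradient is an outer product, so the input is recoverable by elementwise division.

```python
import numpy as np

# Hedged illustration: for out = W @ x + b with a single example,
# dL/dW = g x^T and dL/db = g, where g = dL/d(out). Dividing any row of
# the weight gradient by the matching bias gradient recovers x exactly.
rng = np.random.default_rng(2)
x = rng.normal(size=5)            # "private" training input
W = rng.normal(size=(3, 5))
b = rng.normal(size=3)

out = W @ x + b
g_out = out - 1.0                 # stand-in upstream gradient dL/d(out)

grad_W = np.outer(g_out, x)       # dL/dW = dL/d(out) x^T
grad_b = g_out                    # dL/db = dL/d(out)

x_recovered = grad_W[0] / grad_b[0]   # any row with nonzero grad_b works
print(np.allclose(x_recovered, x))    # True
```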

Aligned Multi-Task Gaussian Process

no code implementations • 29 Oct 2021 Olga Mikheeva, Ieva Kazlauskaite, Adam Hartshorne, Hedvig Kjellström, Carl Henrik Ek, Neill D. F. Campbell

Building on the previous work by Kazlauskaite et al. [2019], we include a separate monotonic warp of the input data to model temporal misalignment.

Bayesian Inference Gaussian Processes +4

Regularising Inverse Problems with Generative Machine Learning Models

no code implementations • 22 Jul 2021 Margaret Duff, Neill D. F. Campbell, Matthias J. Ehrhardt

The success of generative regularisers depends on the quality of the generative model and so we propose a set of desired criteria to assess generative models and guide future research.

BIG-bench Machine Learning Deblurring
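The idea of a generative regulariser can be sketched in miniature. This is a hedged toy version, not the paper's formulation: a linear map stands in for the generative model, and the reconstruction is constrained to its range by optimising over the latent code.

```python
import numpy as np

# Hedged toy sketch: reconstruct from measurements y = A x by restricting
# x to the range of a linear "generator" G and optimising the latent z.
rng = np.random.default_rng(3)
G = rng.normal(size=(10, 2))     # toy generator: z -> G @ z
A = rng.normal(size=(6, 10))     # forward operator (e.g. blur, subsampling)

z_true = np.array([1.0, -2.0])
y = A @ G @ z_true               # noiseless measurements

M = A @ G
step = 1.0 / np.linalg.norm(M, ord=2) ** 2   # safe step for this objective
z = np.zeros(2)
for _ in range(2000):
    z -= step * M.T @ (M @ z - y)            # gradient of 0.5*||M z - y||^2

print(np.round(z, 3))            # should be close to z_true = [1, -2]
```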

Black-box density function estimation using recursive partitioning

1 code implementation • 26 Oct 2020 Erik Bodin, Zhenwen Dai, Neill D. F. Campbell, Carl Henrik Ek

We present a novel approach to Bayesian inference and general Bayesian computation that is defined through a sequential decision loop.

Bayesian Inference

Compositional uncertainty in deep Gaussian processes

1 code implementation • 17 Sep 2019 Ivan Ustyuzhaninov, Ieva Kazlauskaite, Markus Kaiser, Erik Bodin, Neill D. F. Campbell, Carl Henrik Ek

Similarly, deep Gaussian processes (DGPs) should allow us to compute a posterior distribution of compositions of multiple functions giving rise to the observations.

Bayesian Inference Gaussian Processes +1

Modulating Surrogates for Bayesian Optimization

no code implementations ICML 2020 Erik Bodin, Markus Kaiser, Ieva Kazlauskaite, Zhenwen Dai, Neill D. F. Campbell, Carl Henrik Ek

Bayesian optimization (BO) methods often rely on the assumption that the objective function is well-behaved, but in practice, this is seldom true for real-world objectives even if noise-free observations can be collected.

Bayesian Optimization Gaussian Processes

Monotonic Gaussian Process Flow

1 code implementation • 30 May 2019 Ivan Ustyuzhaninov, Ieva Kazlauskaite, Carl Henrik Ek, Neill D. F. Campbell

We propose a new framework for imposing monotonicity constraints in a Bayesian nonparametric setting based on numerical solutions of stochastic differential equations.

Gaussian Processes Time Series +1
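The underlying trick of building monotone functions from differential equation flows can be illustrated simply. This is a hedged sketch, not the paper's Bayesian nonparametric construction: a deterministic ODE with a strictly positive velocity field, integrated by a hypothetical Euler scheme, yields a monotone map.

```python
import math

# Hedged illustration: a monotonic function arises from numerically
# integrating a strictly positive "velocity" field with Euler steps.
def velocity(x):
    return math.exp(-x * x) + 0.1   # strictly positive everywhere

def monotone_warp(t, steps=1000):
    x, dt = 0.0, t / steps
    for _ in range(steps):
        x += velocity(x) * dt       # Euler step of dx/dt = velocity(x)
    return x

ys = [monotone_warp(t) for t in (0.5, 1.0, 1.5, 2.0)]
print(ys)                            # strictly increasing outputs
```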

Gaussian Process Deep Belief Networks: A Smooth Generative Model of Shape with Uncertainty Propagation

1 code implementation • 13 Dec 2018 Alessandro Di Martino, Erik Bodin, Carl Henrik Ek, Neill D. F. Campbell

The shape of an object is an important characteristic for many vision problems such as segmentation, detection and tracking.

Sequence Alignment with Dirichlet Process Mixtures

no code implementations • 26 Nov 2018 Ieva Kazlauskaite, Ivan Ustyuzhaninov, Carl Henrik Ek, Neill D. F. Campbell

We present a probabilistic model for unsupervised alignment of high-dimensional time-warped sequences based on the Dirichlet Process Mixture Model (DPMM).

Gaussian Processes
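The Dirichlet process prior underlying a DPMM has a compact constructive form. This is a hedged sketch of the standard stick-breaking construction (a textbook device, not code from the paper), with an assumed truncation level and concentration parameter.

```python
import numpy as np

# Hedged sketch: truncated stick-breaking construction of DP mixture
# weights. Each beta breaks off a fraction of the remaining stick.
rng = np.random.default_rng(4)
alpha, K = 1.0, 20                  # assumed concentration and truncation
betas = rng.beta(1.0, alpha, size=K)
remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
weights = betas * remaining

print(weights.sum())                 # approaches 1 as K grows
```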

DP-GP-LVM: A Bayesian Non-Parametric Model for Learning Multivariate Dependency Structures

no code implementations • 12 Jul 2018 Andrew R. Lawrence, Carl Henrik Ek, Neill D. F. Campbell

We present a non-parametric Bayesian latent variable model capable of learning dependency structures across dimensions in a multivariate setting.

Training VAEs Under Structured Residuals

2 code implementations • 3 Apr 2018 Garoe Dorta, Sara Vicente, Lourdes Agapito, Neill D. F. Campbell, Ivor Simpson

This paper demonstrates a novel scheme to incorporate a structured Gaussian likelihood prediction network within the VAE that allows the residual correlations to be modeled.
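The core likelihood computation behind a structured Gaussian can be sketched in a few lines. This is a hedged illustration, not the paper's network: a random lower-triangular Cholesky factor stands in for the predicted one, and the log-density follows from standard triangular-solve identities.

```python
import numpy as np

# Hedged illustration: model residuals as N(0, L L^T), where in the
# paper's setting L would be predicted by a network. Here L is random.
rng = np.random.default_rng(1)
d = 4
residual = rng.normal(size=d)

raw = rng.normal(scale=0.1, size=(d, d))                # stand-in output
L = np.tril(raw, k=-1) + np.diag(np.exp(np.diag(raw)))  # positive diagonal

def structured_log_prob(r, L):
    z = np.linalg.solve(L, r)                     # solve L z = r
    half_log_det = np.sum(np.log(np.diag(L)))     # 0.5 * log|L L^T|
    return -0.5 * z @ z - half_log_det - 0.5 * len(r) * np.log(2 * np.pi)

print(structured_log_prob(residual, L))
```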

Gaussian Process Latent Variable Alignment Learning

1 code implementation • 7 Mar 2018 Ieva Kazlauskaite, Carl Henrik Ek, Neill D. F. Campbell

We present a model that can automatically learn alignments between high-dimensional data in an unsupervised manner.

Structured Uncertainty Prediction Networks

2 code implementations CVPR 2018 Garoe Dorta, Sara Vicente, Lourdes Agapito, Neill D. F. Campbell, Ivor Simpson

This paper is the first work to propose a network to predict a structured uncertainty distribution for a synthesized image.

Image Denoising

Nonparametric Inference for Auto-Encoding Variational Bayes

no code implementations • 18 Dec 2017 Erik Bodin, Iman Malik, Carl Henrik Ek, Neill D. F. Campbell

We would like to learn latent representations that are low-dimensional and highly interpretable.

Latent Gaussian Process Regression

no code implementations • 18 Jul 2017 Erik Bodin, Neill D. F. Campbell, Carl Henrik Ek

We introduce Latent Gaussian Process Regression which is a latent variable extension allowing modelling of non-stationary multi-modal processes using GPs.

regression
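The GP regression that the latent-variable extension builds on can be shown minimally. This is a hedged sketch of plain GP posterior-mean prediction with an RBF kernel (the standard model, not the paper's extension); the lengthscale and noise level are assumed values.

```python
import numpy as np

# Hedged sketch: standard GP regression posterior mean with an RBF kernel.
def rbf(a, b, ell=1.0):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(X)
noise = 1e-6                                  # assumed jitter / noise level

K = rbf(X, X) + noise * np.eye(len(X))
Xs = np.array([0.5])
mean = rbf(Xs, X) @ np.linalg.solve(K, y)     # posterior mean at Xs

print(mean)   # close to sin(0.5) ≈ 0.479
```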

Direct, Dense, and Deformable: Template-Based Non-Rigid 3D Reconstruction From RGB Video

no code implementations ICCV 2015 Rui Yu, Chris Russell, Neill D. F. Campbell, Lourdes Agapito

In contrast, our method makes use of a single RGB video as input; it can capture the deformations of generic shapes; and the depth estimation is dense, per-pixel and direct.

3D Reconstruction Depth Estimation +1

Modeling Object Appearance Using Context-Conditioned Component Analysis

no code implementations CVPR 2015 Daniyar Turmukhambetov, Neill D. F. Campbell, Simon J. D. Prince, Jan Kautz

In this work we remove the image space alignment limitations of existing subspace models by conditioning the models on a shape dependent context that allows for the complex, non-linear structure of the appearance of the visual object to be captured and shared.

Object

Hierarchical Subquery Evaluation for Active Learning on a Graph

no code implementations CVPR 2014 Oisin Mac Aodha, Neill D. F. Campbell, Jan Kautz, Gabriel J. Brostow

Under some specific circumstances, Expected Error Reduction has been one of the strongest-performing informativeness criteria for active learning.

Active Learning graph construction +1
