Search Results for author: Matthew Ashman

Found 7 papers, 5 papers with code

Amortised Inference in Neural Networks for Small-Scale Probabilistic Meta-Learning

no code implementations • 24 Oct 2023 • Matthew Ashman, Tommy Rochussen, Adrian Weller

The global inducing point variational approximation for BNNs is based on using a set of inducing inputs to construct a series of conditional distributions that accurately approximate the conditionals of the true posterior distribution.

Bayesian Inference • Meta-Learning
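
A rough mathematical sketch of this construction (symbols here are illustrative, not taken from the paper): the approximate posterior over the weights w is formed from the prior together with Gaussian pseudo-observations of the network outputs at M inducing inputs z_1, ..., z_M,

    q(w) \propto p(w) \prod_{m=1}^{M} \mathcal{N}\!\big( v_m \,\big|\, f_w(z_m),\, \Lambda_m^{-1} \big),

where f_w is the network with weights w, and the pseudo-outputs v_m and precisions \Lambda_m are variational parameters; conditioning the prior on these pseudo-observations is one way to obtain the sequence of conditional distributions described above.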

Causal Reasoning in the Presence of Latent Confounders via Neural ADMG Learning

1 code implementation • 22 Mar 2023 • Matthew Ashman, Chao Ma, Agrin Hilmkil, Joel Jennings, Cheng Zhang

In this work, we further extend the existing body of work and develop a novel gradient-based approach to learning an ADMG with non-linear functional relations from observational data.
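
A minimal sketch of the general idea behind gradient-based structure learning with a differentiable acyclicity penalty (NOTEARS-style); this is not the paper's ADMG method, which additionally learns bidirected edges for latent confounders and uses non-linear functional relations:

    # Illustrative sketch only: gradient-based structure learning with a smooth
    # acyclicity penalty. The paper's ADMG approach also learns a bidirected-edge
    # matrix for latent confounders and non-linear functional relations.
    import numpy as np
    from scipy.linalg import expm

    def acyclicity_penalty(W):
        # h(W) = tr(exp(W * W)) - d equals zero iff the weighted graph W is acyclic.
        d = W.shape[0]
        return np.trace(expm(W * W)) - d

    def objective(W, X, lam=0.1, rho=10.0):
        # Linear least-squares fit of each variable on its parents (a stand-in for
        # non-linear relations), plus sparsity and acyclicity terms.
        residual = X - X @ W
        return 0.5 * (residual ** 2).mean() + lam * np.abs(W).sum() + rho * acyclicity_penalty(W)

Both the fit term and the penalty are differentiable in the adjacency weights W, so the graph can be optimised directly with gradient methods.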

Differentially private partitioned variational inference

1 code implementation • 23 Sep 2022 • Mikko A. Heikkilä, Matthew Ashman, Siddharth Swaroop, Richard E. Turner, Antti Honkela

In this paper, we present differentially private partitioned variational inference, the first general framework for learning a variational approximation to a Bayesian posterior distribution in the federated learning setting while minimising the number of communication rounds and providing differential privacy guarantees for data subjects.

Federated Learning • Privacy Preserving • +1
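
A minimal sketch of how a single client update might be privatised in such a setting, assuming a standard clip-and-add-Gaussian-noise mechanism; the paper's actual mechanism, clipping strategy and privacy accounting may differ:

    # Illustrative sketch only: privatise a client's local update (e.g. a change in
    # variational natural parameters) by bounding its norm and adding Gaussian noise
    # before it is communicated to the server.
    import numpy as np

    def privatise_update(delta, clip_norm=1.0, noise_multiplier=1.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        norm = np.linalg.norm(delta)
        clipped = delta * min(1.0, clip_norm / (norm + 1e-12))  # bound sensitivity
        noise = rng.normal(0.0, noise_multiplier * clip_norm, size=delta.shape)
        return clipped + noise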

Do Concept Bottleneck Models Learn as Intended?

no code implementations • 10 May 2021 • Andrei Margeloiu, Matthew Ashman, Umang Bhatt, Yanzhi Chen, Mateja Jamnik, Adrian Weller

Concept bottleneck models map from raw inputs to concepts, and then from concepts to targets.
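
A schematic of this two-stage structure (layer sizes and names are illustrative, not the paper's):

    # Schematic concept bottleneck model: the target is predicted only from the
    # intermediate concept predictions, never directly from the raw inputs.
    import torch
    import torch.nn as nn

    class ConceptBottleneck(nn.Module):
        def __init__(self, n_inputs, n_concepts, n_targets):
            super().__init__()
            self.input_to_concepts = nn.Sequential(
                nn.Linear(n_inputs, 64), nn.ReLU(), nn.Linear(64, n_concepts))
            self.concepts_to_target = nn.Linear(n_concepts, n_targets)

        def forward(self, x):
            concepts = torch.sigmoid(self.input_to_concepts(x))
            return self.concepts_to_target(concepts), concepts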

Scalable Gaussian Process Variational Autoencoders

1 code implementation • 26 Oct 2020 • Metod Jazbec, Matthew Ashman, Vincent Fortuin, Michael Pearce, Stephan Mandt, Gunnar Rätsch

Conventional variational autoencoders fail in modeling correlations between data points due to their use of factorized priors.
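
A rough contrast in equations (notation illustrative, not the paper's): a conventional VAE places a factorised prior over the latent codes of the N data points, whereas a GP-VAE couples them through a kernel over auxiliary indices x_1, ..., x_N (e.g. time stamps),

    p_{\mathrm{VAE}}(Z) = \prod_{n=1}^{N} \mathcal{N}(z_n \mid 0, I),
    \qquad
    p_{\mathrm{GP\text{-}VAE}}(Z) = \prod_{d=1}^{L} \mathcal{N}\big(z_{:,d} \mid 0, K_{XX}\big),

where K_{XX} is the N x N kernel matrix over the indices and L is the latent dimension, so correlations between data points enter through K_{XX}.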
