Search Results for author: Heiko Strathmann

Found 16 papers, 9 papers with code

Neural Variational Gradient Descent

2 code implementations • AABI Symposium 2022 • Lauro Langosco di Langosco, Vincent Fortuin, Heiko Strathmann

Particle-based approximate Bayesian inference approaches such as Stein Variational Gradient Descent (SVGD) combine the flexibility and convergence guarantees of sampling methods with the computational benefits of variational inference.

Bayesian Inference Variational Inference
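The SVGD update the abstract refers to moves each particle along a kernel-smoothed gradient of the log target plus a repulsive term. A minimal sketch for a 1-D standard-normal target, assuming a fixed-bandwidth RBF kernel (the paper's neural variant replaces this fixed kernel, so this is only the classical baseline):

```python
import numpy as np

def svgd_step(x, grad_logp, h=1.0, step=0.1):
    """One SVGD update for 1-D particles with an RBF kernel of bandwidth h."""
    n = len(x)
    diff = x[:, None] - x[None, :]          # diff[j, i] = x_j - x_i
    k = np.exp(-diff**2 / h)                # k(x_j, x_i)
    grad_k = -2.0 / h * diff * k            # d/dx_j k(x_j, x_i), the repulsive term
    phi = (k.T @ grad_logp(x) + grad_k.sum(axis=0)) / n
    return x + step * phi

# Toy target: standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, size=50)            # particles start far from the target
for _ in range(500):
    x = svgd_step(x, lambda x: -x)          # particles drift toward the target
```

The deterministic update combines attraction to high-density regions with kernel-gradient repulsion, which is what keeps the particles spread out rather than collapsing onto the mode.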

NeRF-VAE: A Geometry Aware 3D Scene Generative Model

no code implementations • 1 Apr 2021 • Adam R. Kosiorek, Heiko Strathmann, Daniel Zoran, Pol Moreno, Rosalia Schneider, Soňa Mokrá, Danilo J. Rezende

We propose NeRF-VAE, a 3D scene generative model that incorporates geometric structure via NeRF and differentiable volume rendering.
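The differentiable volume rendering mentioned here composites colour samples along each camera ray with density-derived weights. A minimal NumPy sketch of that standard quadrature (function and variable names are illustrative, not the paper's API):

```python
import numpy as np

def render_ray(sigmas, colors, deltas):
    """Composite samples along one ray: w_i = T_i * (1 - exp(-sigma_i * delta_i))."""
    alpha = 1.0 - np.exp(-sigmas * deltas)                         # per-segment opacity
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alpha)))[:-1]  # transmittance T_i
    weights = trans * alpha                                        # sum to at most 1
    return weights @ colors                                        # expected colour

sigmas = np.array([0.1, 5.0, 0.2])                       # densities at 3 ray samples
colors = np.array([[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]])
deltas = np.full(3, 0.5)                                 # distances between samples
rgb = render_ray(sigmas, colors, deltas)                 # dominated by the dense sample
```

Because every operation is differentiable, gradients flow from rendered pixels back into the densities and colours, which is what lets a latent-variable model be trained through the renderer.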

Persistent Message Passing

no code implementations • ICLR Workshop GTRL 2021 • Heiko Strathmann, Mohammadamin Barekatain, Charles Blundell, Petar Veličković

Graph neural networks (GNNs) are a powerful inductive bias for modelling algorithmic reasoning procedures and data structures.
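A single message-passing round is the GNN primitive such models build on; a sketch with sum aggregation (the weights and update rule here are illustrative only, not the paper's persistent data-structure mechanism):

```python
import numpy as np

def message_passing_step(h, edges, W_msg, W_upd):
    """One round of sum-aggregation message passing.
    h: (n, d) node features; edges: list of (src, dst) pairs."""
    msgs = np.zeros_like(h)
    for src, dst in edges:
        msgs[dst] += h[src] @ W_msg         # message from src to dst
    return np.tanh(h @ W_upd + msgs)        # update node states

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))                 # 4 nodes, 8-dim features
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]    # a directed cycle
W_msg = rng.normal(size=(8, 8)) * 0.1
W_upd = rng.normal(size=(8, 8)) * 0.1
h_new = message_passing_step(h, edges, W_msg, W_upd)
```

Stacking such rounds lets information propagate along paths in the graph, which is the inductive bias the abstract refers to for algorithmic reasoning.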

Meta-Learning Mean Functions for Gaussian Processes

no code implementations • 23 Jan 2019 • Vincent Fortuin, Heiko Strathmann, Gunnar Rätsch

Approaches to meta-learning in Gaussian process models have mostly focused on learning the kernel function of the prior, but not on learning its mean function.

Gaussian Processes Meta-Learning
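Once a prior mean m(x) is available (learned or otherwise), GP prediction just applies the standard posterior equations to the residual y − m(X). A sketch with a hypothetical linear mean function standing in for a learned one:

```python
import numpy as np

def gp_posterior_mean(X, y, Xs, mean_fn, k, noise=1e-2):
    """GP posterior mean at test points Xs with a non-zero prior mean function."""
    K = k(X[:, None], X[None, :]) + noise * np.eye(len(X))
    Ks = k(Xs[:, None], X[None, :])
    alpha = np.linalg.solve(K, y - mean_fn(X))   # fit the residual around the mean
    return mean_fn(Xs) + Ks @ alpha

rbf = lambda a, b: np.exp(-0.5 * (a - b) ** 2)
X = np.linspace(-3, 3, 20)
y = np.sin(X) + X                                # data with a linear trend
post = gp_posterior_mean(X, y, X, mean_fn=lambda x: x, k=rbf)
```

A good mean function absorbs systematic structure (here the linear trend), so the kernel only has to model the remaining residual, which is the motivation for learning it across tasks.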

Learning deep kernels for exponential family densities

1 code implementation • 20 Nov 2018 • Li Wenliang, Danica J. Sutherland, Heiko Strathmann, Arthur Gretton

The kernel exponential family is a rich class of distributions, which can be fit efficiently and with statistical guarantees by score matching.

Scalable Gaussian Processes on Discrete Domains

no code implementations • 24 Oct 2018 • Vincent Fortuin, Gideon Dresdner, Heiko Strathmann, Gunnar Rätsch

We explore different techniques for selecting inducing points on discrete domains, including greedy selection, determinantal point processes, and simulated annealing.

Gaussian Processes Point Processes
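One common greedy criterion for inducing-point selection picks, at each step, the candidate with the largest residual prior variance, which is equivalent to pivoted Cholesky on the kernel matrix. A sketch of that idea (not necessarily the paper's exact objective):

```python
import numpy as np

def greedy_inducing_points(K, m):
    """Greedily pick m inducing indices by largest residual prior variance
    (pivoted Cholesky on the kernel matrix K)."""
    n = K.shape[0]
    resid = np.diag(K).astype(float).copy()   # prior variance not yet explained
    L = np.zeros((n, m))
    chosen = []
    for j in range(m):
        i = int(np.argmax(resid))             # most-uncertain remaining point
        chosen.append(i)
        L[:, j] = (K[:, i] - L @ L[i]) / np.sqrt(resid[i])
        resid -= L[:, j] ** 2                 # variance explained by pick i
    return chosen

# Toy discrete domain: three well-separated clusters of items.
pts = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 10.0])
K = np.exp(-0.5 * (pts[:, None] - pts[None, :]) ** 2)
chosen = greedy_inducing_points(K, 3)         # one index per cluster
```

Each pick suppresses the residual variance of everything correlated with it, so the selection naturally spreads across the domain, one representative per cluster here.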

SOM-VAE: Interpretable Discrete Representation Learning on Time Series

6 code implementations • ICLR 2019 • Vincent Fortuin, Matthias Hüser, Francesco Locatello, Heiko Strathmann, Gunnar Rätsch

We evaluate our model in terms of clustering performance and interpretability on static (Fashion-)MNIST data, a time series of linearly interpolated (Fashion-)MNIST images, a chaotic Lorenz attractor system with two macro states, as well as on a challenging real world medical time series application on the eICU data set.

Dimensionality Reduction Representation Learning +2

Efficient and principled score estimation with Nyström kernel exponential families

1 code implementation • 23 May 2017 • Danica J. Sutherland, Heiko Strathmann, Michael Arbel, Arthur Gretton

We propose a fast method with statistical guarantees for learning an exponential family density model where the natural parameter is in a reproducing kernel Hilbert space, and may be infinite-dimensional.

Denoising Density Estimation

Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy

1 code implementation • 14 Nov 2016 • Danica J. Sutherland, Hsiao-Yu Tung, Heiko Strathmann, Soumyajit De, Aaditya Ramdas, Alex Smola, Arthur Gretton

In this context, the MMD may be used in two roles: first, as a discriminator, either directly on the samples, or on features of the samples; second, as a tool for model criticism, evaluating a trained generative model.
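In the discriminator role, the sample-based statistic is the standard unbiased MMD² estimator; a minimal sketch:

```python
import numpy as np

def mmd2_unbiased(X, Y, k):
    """Unbiased estimate of MMD^2 between 1-D samples X and Y under kernel k."""
    m, n = len(X), len(Y)
    Kxx = k(X[:, None], X[None, :])
    Kyy = k(Y[:, None], Y[None, :])
    Kxy = k(X[:, None], Y[None, :])
    np.fill_diagonal(Kxx, 0.0)              # drop i == j terms for unbiasedness
    np.fill_diagonal(Kyy, 0.0)
    return (Kxx.sum() / (m * (m - 1))
            + Kyy.sum() / (n * (n - 1))
            - 2.0 * Kxy.mean())

rbf = lambda a, b: np.exp(-0.5 * (a - b) ** 2)
rng = np.random.default_rng(0)
same = mmd2_unbiased(rng.normal(size=500), rng.normal(size=500), rbf)       # near 0
diff = mmd2_unbiased(rng.normal(size=500), rng.normal(3.0, size=500), rbf)  # large
```

Because the estimator is unbiased and differentiable in the kernel parameters, it can be both thresholded as a two-sample test statistic and optimized, which is what the paper's "optimized MMD" refers to.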

A Kernel Test of Goodness of Fit

1 code implementation • 9 Feb 2016 • Kacper Chwialkowski, Heiko Strathmann, Arthur Gretton

Our test statistic is based on an empirical estimate of this divergence, taking the form of a V-statistic in terms of the log gradients of the target density and the kernel.

Density Estimation
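For a 1-D target with score s(x) = ∇ log p(x) and an RBF kernel k, the V-statistic the abstract describes averages the Stein kernel h_p(x, y) = s(x)s(y)k + s(x)∂_y k + s(y)∂_x k + ∂_x∂_y k over all sample pairs; a sketch under those assumptions:

```python
import numpy as np

def ksd_v(x, score, sigma=1.0):
    """V-statistic kernel Stein discrepancy for 1-D samples x against a target
    specified only through its score function, using an RBF kernel."""
    d = x[:, None] - x[None, :]
    k = np.exp(-d**2 / (2 * sigma**2))
    dkx = -d / sigma**2 * k                         # d/dx k(x, y)
    dky = d / sigma**2 * k                          # d/dy k(x, y)
    dkxy = (1.0 / sigma**2 - d**2 / sigma**4) * k   # d^2/dxdy k(x, y)
    s = score(x)
    h = (s[:, None] * s[None, :] * k                # Stein kernel h_p(x_i, x_j)
         + s[:, None] * dky + s[None, :] * dkx + dkxy)
    return h.mean()

rng = np.random.default_rng(0)
score = lambda x: -x                                # score of a standard normal target
good = ksd_v(rng.normal(size=300), score)           # samples match the target
bad = ksd_v(rng.normal(2.0, size=300), score)       # shifted samples score worse
```

The appeal is that only the score of the target is needed, so the (often intractable) normalizing constant of the density never appears.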

Kernel Sequential Monte Carlo

1 code implementation • 11 Oct 2015 • Ingmar Schuster, Heiko Strathmann, Brooks Paige, Dino Sejdinovic

As KSMC does not require access to target gradients, it is particularly applicable on targets whose gradients are unknown or prohibitively expensive.

Unbiased Bayes for Big Data: Paths of Partial Posteriors

no code implementations • 14 Jan 2015 • Heiko Strathmann, Dino Sejdinovic, Mark Girolami

A key quantity of interest in Bayesian inference is the expectation of functions with respect to a posterior distribution.

Bayesian Inference

Kernel Adaptive Metropolis-Hastings

1 code implementation • 19 Jul 2013 • Dino Sejdinovic, Heiko Strathmann, Maria Lomeli Garcia, Christophe Andrieu, Arthur Gretton

A Kernel Adaptive Metropolis-Hastings algorithm is introduced, for the purpose of sampling from a target distribution with strongly nonlinear support.

On Russian Roulette Estimates for Bayesian Inference with Doubly-Intractable Likelihoods

no code implementations • 17 Jun 2013 • Anne-Marie Lyne, Mark Girolami, Yves Atchadé, Heiko Strathmann, Daniel Simpson

The methodology is reviewed on well-known examples such as the parameters in Ising models, the posterior for Fisher-Bingham distributions on the $d$-Sphere and a large-scale Gaussian Markov Random Field model describing the Ozone Column data.

Bayesian Inference
