no code implementations • 26 Sep 2023 • Martin Jankowiak, Du Phan
To expand the space of flexible variational families, we revisit Variational Rejection Sampling (VRS) [Grover et al., 2018], which combines a parametric proposal distribution with rejection sampling to define a rich non-parametric family of distributions that explicitly utilizes the known target distribution.
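The key ingredient VRS builds on is classical rejection sampling against a parametric proposal. A minimal stdlib-only sketch of that building block (plain rejection sampling, not the full VRS variational family): the target here is a Beta(2, 2) density and the proposal is Uniform(0, 1), both chosen purely for illustration.

```python
import random

def target_pdf(x):
    # Beta(2, 2) density on [0, 1]: 6 * x * (1 - x)
    return 6.0 * x * (1.0 - x)

def rejection_sample(n, seed=0):
    """Rejection sampling with a Uniform(0, 1) proposal q.
    M bounds target_pdf(x) / q(x); here max of 6x(1-x) is 1.5."""
    rng = random.Random(seed)
    M = 1.5
    samples = []
    while len(samples) < n:
        x = rng.random()            # draw from the proposal q
        u = rng.random()            # uniform variate for the accept test
        if u * M <= target_pdf(x):  # accept with probability p(x) / (M q(x))
            samples.append(x)
    return samples

draws = rejection_sample(10_000)
mean = sum(draws) / len(draws)      # Beta(2, 2) has mean 0.5
```

VRS goes further by learning the proposal parameters and relaxing the accept test, which is what yields a tunable non-parametric family; the accept/reject mechanics are the same.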
2 code implementations • 2 Aug 2022 • Martin Jankowiak
Bayesian variable selection is a powerful tool for data analysis, as it offers a principled method for variable selection that accounts for prior information and uncertainty.
no code implementations • 22 Dec 2021 • Martin Jankowiak, Du Phan
Variational inference is a powerful paradigm for approximate Bayesian inference with a number of appealing properties, including support for model learning and data subsampling.
1 code implementation • 28 Jun 2021 • Martin Jankowiak
Bayesian variable selection is a powerful tool for data analysis, as it offers a principled method for variable selection that accounts for prior information and uncertainty.
no code implementations • 24 May 2021 • Martin Jankowiak, Geoff Pleiss
We introduce a simple and scalable method for training Gaussian process (GP) models that exploits cross-validation and nearest neighbor truncation.
2 code implementations • 27 Feb 2021 • David Eriksson, Martin Jankowiak
Bayesian optimization (BO) is a powerful paradigm for efficient optimization of black-box objective functions.
1 code implementation • NeurIPS 2020 • Geoff Pleiss, Martin Jankowiak, David Eriksson, Anil Damle, Jacob R. Gardner
Matrix square roots and their inverses arise frequently in machine learning, e.g., when sampling from high-dimensional Gaussians $\mathcal{N}(\mathbf 0, \mathbf K)$ or whitening a vector $\mathbf b$ against covariance matrix $\mathbf K$.
no code implementations • 21 Feb 2020 • Martin Jankowiak, Geoff Pleiss, Jacob R. Gardner
We introduce Deep Sigma Point Processes, a class of parametric models inspired by the compositional structure of Deep Gaussian Processes (DGPs).
3 code implementations • 24 Dec 2019 • Du Phan, Neeraj Pradhan, Martin Jankowiak
NumPyro is a lightweight library that provides an alternate NumPy backend to the Pyro probabilistic programming language with the same modeling interface, language primitives and effect handling abstractions.
1 code implementation • 1 Nov 2019 • Adam Foster, Martin Jankowiak, Matthew O'Meara, Yee Whye Teh, Tom Rainforth
We introduce a fully stochastic gradient based approach to Bayesian optimal experimental design (BOED).
1 code implementation • 23 Oct 2019 • Fritz Obermeyer, Eli Bingham, Martin Jankowiak, Du Phan, Jonathan P. Chen
It is a significant challenge to design probabilistic programming systems that can accommodate a wide variety of inference strategies within a unified framework.
no code implementations • ICML 2020 • Martin Jankowiak, Geoff Pleiss, Jacob R. Gardner
In an extensive empirical comparison with a number of alternative methods for scalable GP regression, we find that the resulting predictive distributions exhibit significantly better calibrated uncertainties and higher log likelihoods, often by as much as half a nat per datapoint.
no code implementations • 31 May 2019 • Martin Jankowiak, Jacob Gardner
We construct flexible likelihoods for multi-output Gaussian process models that leverage neural networks as components.
1 code implementation • NeurIPS 2019 • Adam Foster, Martin Jankowiak, Eli Bingham, Paul Horsfall, Yee Whye Teh, Tom Rainforth, Noah Goodman
Bayesian optimal experimental design (BOED) is a principled framework for making efficient use of limited experimental resources.
no code implementations • 8 Feb 2019 • Fritz Obermeyer, Eli Bingham, Martin Jankowiak, Justin Chiu, Neeraj Pradhan, Alexander Rush, Noah Goodman
To exploit efficient tensor algebra in graphs with plates of variables, we generalize undirected factor graphs to plated factor graphs and variable elimination to a tensor variable elimination algorithm that operates directly on plated factor graphs.
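The core primitive being generalized is sum-product variable elimination. A stdlib-only sketch of elimination on an unplated chain of pairwise factors (the paper's contribution is extending this to plated factor graphs; nested lists stand in for tensors here):

```python
def eliminate_chain(factors):
    """Sum-product variable elimination on a chain x0 - x1 - ... - xT.
    factors[t][i][j] is the pairwise factor value at (x_t = i, x_{t+1} = j).
    Returns the partition function: the sum over all joint assignments."""
    # message[i] accumulates the partial sum over already-eliminated
    # variables, indexed by the current variable's value
    message = [1.0] * len(factors[0])
    for f in factors:
        message = [
            sum(message[i] * f[i][j] for i in range(len(f)))
            for j in range(len(f[0]))
        ]
    return sum(message)

# Three binary variables, two factors: cost is linear in chain length,
# versus exponential for the naive sum over all 2^3 assignments.
f = [[1.0, 2.0], [3.0, 4.0]]
Z = eliminate_chain([f, f])
```

Tensor variable elimination performs the analogous contractions with batched tensor ops, so plates (exchangeable replicates) are handled in parallel rather than unrolled.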
no code implementations • 2 Nov 2018 • Martin Jankowiak
In this note we consider setups in which variational objectives for Bayesian neural networks can be computed in closed form.
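One piece of such objectives that is always available in closed form for a mean-field Gaussian posterior is the KL regularizer against a standard Normal prior. A small sketch of that standard term (the note's contribution concerns computing the expected log-likelihood term in closed form as well, which is not shown here):

```python
import math

def kl_gaussian_std_normal(mu, sigma):
    """Closed-form KL( N(mu, sigma^2) || N(0, 1) ) =
    0.5 * (sigma^2 + mu^2 - 1) - log(sigma)."""
    return 0.5 * (sigma ** 2 + mu ** 2 - 1.0) - math.log(sigma)

def kl_term(mus, sigmas):
    """KL for a factorized Gaussian posterior over a weight vector,
    as appears in the ELBO for a mean-field Bayesian neural network."""
    return sum(kl_gaussian_std_normal(m, s) for m, s in zip(mus, sigmas))
```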
1 code implementation • 18 Oct 2018 • Eli Bingham, Jonathan P. Chen, Martin Jankowiak, Fritz Obermeyer, Neeraj Pradhan, Theofanis Karaletsos, Rohit Singh, Paul Szerlip, Paul Horsfall, Noah D. Goodman
Pyro is a probabilistic programming language built on Python as a platform for developing advanced probabilistic models in AI research.
no code implementations • ICML 2018 • Martin Jankowiak, Fritz Obermeyer
We observe that gradients computed via the reparameterization trick are in direct correspondence with solutions of the transport equation in the formalism of optimal transport.
no code implementations • 5 Jun 2018 • Martin Jankowiak, Theofanis Karaletsos
We exploit the link between the transport equation and derivatives of expectations to construct efficient pathwise gradient estimators for multivariate distributions.
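In the simplest univariate Gaussian case the pathwise (reparameterization) estimator is textbook material: writing $z = \mu + \sigma \epsilon$ with $\epsilon \sim \mathcal{N}(0,1)$ gives $\partial_\mu \, \mathbb{E}[f(z)] = \mathbb{E}[f'(\mu + \sigma \epsilon)]$. A stdlib-only sketch of that baseline (the paper's contribution is constructing such estimators for multivariate distributions via the transport equation, which this does not show):

```python
import random

def pathwise_grad_mu(f_prime, mu, sigma, n=50_000, seed=0):
    """Monte Carlo pathwise gradient of E_{z ~ N(mu, sigma^2)}[f(z)] w.r.t. mu.
    Reparameterize z = mu + sigma * eps with eps ~ N(0, 1); since dz/dmu = 1,
    the gradient is estimated as the sample mean of f'(mu + sigma * eps)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        eps = rng.gauss(0.0, 1.0)
        total += f_prime(mu + sigma * eps)
    return total / n

# For f(z) = z^2, E[f(z)] = mu^2 + sigma^2, so the exact gradient is 2 * mu.
grad = pathwise_grad_mu(lambda z: 2.0 * z, mu=1.5, sigma=0.7)
```

With 50,000 samples the estimate lands close to the analytic value 3.0, illustrating the low variance that makes pathwise estimators the default in reparameterizable models.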
no code implementations • LREC 2014 • Kevin Reschke, Martin Jankowiak, Mihai Surdeanu, Christopher Manning, Daniel Jurafsky
We present a new publicly available dataset and event extraction task in the plane crash domain based on Wikipedia infoboxes and newswire text.