Search Results for author: Matthew D. Hoffman

Found 28 papers, 11 papers with code

The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo

8 code implementations18 Nov 2011 Matthew D. Hoffman, Andrew Gelman

Hamiltonian Monte Carlo (HMC) is a Markov chain Monte Carlo (MCMC) algorithm that avoids the random walk behavior and sensitivity to correlated parameters that plague many MCMC methods by taking a series of steps informed by first-order gradient information.
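
To make the gradient-informed steps concrete, here is a minimal numpy sketch of one HMC transition using leapfrog integration, with a standard-normal target and hand-picked step size and path length (exactly the quantities NUTS tunes automatically); all names are illustrative.

```python
import numpy as np

def leapfrog(theta, r, grad_log_p, eps, L):
    """Simulate Hamiltonian dynamics for L leapfrog steps of size eps."""
    r = r + 0.5 * eps * grad_log_p(theta)
    for _ in range(L - 1):
        theta = theta + eps * r
        r = r + eps * grad_log_p(theta)
    theta = theta + eps * r
    r = r + 0.5 * eps * grad_log_p(theta)
    return theta, r

def hmc_step(theta, log_p, grad_log_p, eps=0.1, L=20, rng=np.random):
    r0 = rng.standard_normal(theta.shape)           # resample momentum
    theta_new, r_new = leapfrog(theta, r0, grad_log_p, eps, L)
    # Metropolis correction for the discretization error
    log_accept = (log_p(theta_new) - 0.5 * r_new @ r_new
                  - log_p(theta) + 0.5 * r0 @ r0)
    return theta_new if np.log(rng.uniform()) < log_accept else theta

# Example: sample from a 2-D standard normal
log_p = lambda th: -0.5 * th @ th
grad_log_p = lambda th: -th
theta = np.zeros(2)
samples = []
for _ in range(1000):
    theta = hmc_step(theta, log_p, grad_log_p)
    samples.append(theta)
```

NUTS removes the need to hand-tune the path length L by growing the trajectory until it starts to double back on itself.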

A Generative Product-of-Filters Model of Audio

1 code implementation20 Dec 2013 Dawen Liang, Matthew D. Hoffman, Gautham J. Mysore

We propose the product-of-filters (PoF) model, a generative model that decomposes audio spectra as sparse linear combinations of "filters" in the log-spectral domain.
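
The decomposition is simple to state: a log-spectrum is approximated as a sparse, nonnegative combination of learned log-domain filters, so that exponentiating gives a product of filters in the linear domain. A minimal numpy sketch of the reconstruction (filters and activations are random stand-ins here, not learned as in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
F, L = 257, 20                             # frequency bins, number of filters
filters = rng.standard_normal((L, F))      # log-spectral "filters"
a = np.maximum(rng.standard_normal(L), 0)  # sparse nonnegative activations

log_spectrum = a @ filters       # sparse linear combination in the log domain
spectrum = np.exp(log_spectrum)  # linear domain: a product of per-filter terms
```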

Speaker Identification

Structured Stochastic Variational Inference

no code implementations16 Apr 2014 Matthew D. Hoffman, David M. Blei

Stochastic variational inference makes it possible to approximate posterior distributions induced by large datasets quickly using stochastic optimization.
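
The key mechanic is replacing full-dataset gradients of the ELBO with noisy minibatch estimates. Below is a generic sketch on a toy Gaussian-mean model using reparameterization gradients; note the paper itself works with natural gradients in conditionally conjugate models and contributes a structured (non-mean-field) variant, so treat this only as the basic stochastic-optimization loop:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
data = rng.normal(2.0, 1.0, N)      # x_i ~ N(mu, 1), prior mu ~ N(0, 10)

m, log_s = 0.0, 0.0                 # q(mu) = N(m, s^2), s = exp(log_s)
lr, B = 0.1, 100
for _ in range(50_000):
    x = rng.choice(data, B)
    eps = rng.standard_normal()
    mu = m + np.exp(log_s) * eps    # reparameterized draw from q
    # minibatch estimate of d/dmu log p(x, mu), rescaled by N/B
    g = (N / B) * np.sum(x - mu) - mu / 10.0
    m += lr * g / N                                    # chain rule: dmu/dm = 1
    log_s += lr * (g * np.exp(log_s) * eps + 1.0) / N  # + entropy gradient

# Exact posterior variance is 1/(N + 0.1); compare the fit.
print(m, np.exp(log_s), np.sqrt(1 / (N + 0.1)))
```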

Stochastic Optimization Variational Inference

A trust-region method for stochastic variational inference with applications to streaming data

no code implementations28 May 2015 Lucas Theis, Matthew D. Hoffman

However, the algorithm is prone to local optima, which can make the quality of the posterior approximation sensitive to the choice of hyperparameters and initialization.

Variational Inference

The Stan Math Library: Reverse-Mode Automatic Differentiation in C++

1 code implementation23 Sep 2015 Bob Carpenter, Matthew D. Hoffman, Marcus Brubaker, Daniel Lee, Peter Li, Michael Betancourt

As computational challenges in optimization and statistical inference grow ever harder, algorithms that utilize derivatives are becoming increasingly important.
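
Stan's math library implements reverse-mode automatic differentiation in C++; the mechanism can be sketched in a few lines of Python with a toy operator-overloading "tape" (illustrative only, not Stan's actual design):

```python
class Var:
    """A scalar that records its computation for reverse-mode AD."""
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __mul__(self, other):
        # local partials: d(xy)/dx = y, d(xy)/dy = x
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self, seed=1.0):
        self.grad += seed
        for parent, partial in self.parents:
            parent.backward(seed * partial)

x, y = Var(3.0), Var(4.0)
z = x * y + x          # z = xy + x
z.backward()
print(x.grad, y.grad)  # 5.0, 3.0  (dz/dx = y + 1, dz/dy = x)
```

Real implementations record the expression graph iteratively and sweep it once in reverse, which is what makes gradients of high-dimensional log densities cheap.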

Mathematical Software (ACM classes: G.1.0; G.1.3; G.1.4; F.2.1)

A Variational Analysis of Stochastic Gradient Algorithms

no code implementations8 Feb 2016 Stephan Mandt, Matthew D. Hoffman, David M. Blei

With constant learning rates, SGD is a stochastic process that, after an initial phase of convergence, generates samples from a stationary distribution.
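
The stationary-distribution claim can be made precise in the quadratic case. Near an optimum with curvature matrix $A$ and gradient-noise covariance $BB^\top$, constant-rate SGD is well approximated by an Ornstein-Uhlenbeck process whose stationary covariance $\Sigma$ solves a Lyapunov equation; minibatch-size factors are elided below, so take the scaling as a sketch rather than the paper's exact result:

$d\theta_t = -A\,\theta_t\,dt + \sqrt{\epsilon}\,B\,dW_t, \qquad A\Sigma + \Sigma A^\top = \epsilon\,B B^\top$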

Variational Inference

Deep Probabilistic Programming

no code implementations13 Jan 2017 Dustin Tran, Matthew D. Hoffman, Rif A. Saurous, Eugene Brevdo, Kevin Murphy, David M. Blei

By treating inference as a first class citizen, on a par with modeling, we show that probabilistic programming can be as flexible and computationally efficient as traditional deep learning.
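
"Inference as a first-class citizen" means a model is written once as a log-density and inference algorithms are interchangeable objects applied to it. A language-agnostic sketch of that separation (hypothetical names, not the Edward API the paper describes):

```python
import numpy as np

def log_joint(theta, x):
    """Model: theta ~ N(0, 1), x_i ~ N(theta, 1). Written once, used by any engine."""
    return -0.5 * theta**2 - 0.5 * np.sum((x - theta) ** 2)

def map_inference(log_joint, x, lr=0.01, steps=500):
    """Engine 1: MAP via finite-difference gradient ascent (illustrative)."""
    theta = 0.0
    for _ in range(steps):
        g = (log_joint(theta + 1e-4, x) - log_joint(theta - 1e-4, x)) / 2e-4
        theta += lr * g / len(x)
    return theta

def importance_inference(log_joint, x, n=5000, rng=np.random.default_rng(0)):
    """Engine 2: self-normalized importance sampling for the posterior mean."""
    thetas = rng.normal(0.0, 3.0, n)                  # proposal N(0, 3^2)
    logw = np.array([log_joint(t, x) for t in thetas])
    logw -= -0.5 * (thetas / 3.0) ** 2 - np.log(3.0)  # divide by proposal density
    w = np.exp(logw - logw.max())
    return np.sum(w * thetas) / np.sum(w)

x = np.random.default_rng(1).normal(1.5, 1.0, 50)
print(map_inference(log_joint, x), importance_inference(log_joint, x))
```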

Probabilistic Programming Variational Inference

Stochastic Gradient Descent as Approximate Bayesian Inference

1 code implementation13 Apr 2017 Stephan Mandt, Matthew D. Hoffman, David M. Blei

Specifically, we show how to adjust the tuning parameters of constant SGD to best match the stationary distribution to a posterior, minimizing the Kullback-Leibler divergence between these two distributions.
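
To see the correspondence concretely, here is a toy 1-D conjugate model where the exact posterior is known; with a suitably chosen constant step size (derived by hand for this toy, not via the paper's general formula), the SGD iterates match the posterior's mean and variance after burn-in:

```python
import numpy as np

rng = np.random.default_rng(0)
N, batch = 1000, 10
data = rng.normal(1.0, 1.0, N)   # x_i ~ N(theta, 1), flat prior on theta
eps = 2 * batch / N              # constant rate matching the posterior variance
                                 # for this toy model

theta, iterates = 0.5, []
for t in range(50_000):
    xb = rng.choice(data, batch)
    theta += eps * (np.mean(xb) - theta)  # SGD step on the scaled log-likelihood
    if t > 1000:
        iterates.append(theta)

# Exact posterior is N(mean(data), 1/N); the iterates should match both moments.
print(np.mean(iterates), np.std(iterates))
print(np.mean(data), np.sqrt(1 / N))
```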

Bayesian Inference

Multimodal Prediction and Personalization of Photo Edits with Deep Generative Models

no code implementations17 Apr 2017 Ardavan Saeedi, Matthew D. Hoffman, Stephen J. DiVerdi, Asma Ghandeharioun, Matthew J. Johnson, Ryan P. Adams

Professional-grade software applications are powerful but complicated: expert users can achieve impressive results, but novices often struggle to complete even basic tasks.

Learning Deep Latent Gaussian Models with Markov Chain Monte Carlo

no code implementations ICML 2017 Matthew D. Hoffman

Deep latent Gaussian models are powerful and popular probabilistic models of high-dimensional data.

Generalizing Hamiltonian Monte Carlo with Neural Networks

2 code implementations ICLR 2018 Daniel Levy, Matthew D. Hoffman, Jascha Sohl-Dickstein

We present a general-purpose method to train Markov chain Monte Carlo kernels, parameterized by deep neural networks, that converge and mix quickly to their target distribution.

Variational Autoencoders for Collaborative Filtering

18 code implementations16 Feb 2018 Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, Tony Jebara

We introduce a generative model with multinomial likelihood and use Bayesian inference for parameter estimation. This non-linear probabilistic model enables us to go beyond the limited modeling capacity of the linear factor models that still largely dominate collaborative filtering research.
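
The multinomial likelihood is the distinctive piece: the decoder outputs a probability vector over all items, and a user's click history is scored as a single multinomial draw. A numpy sketch of that log-likelihood (the decoder logits are a random stand-in for a network's output):

```python
import numpy as np

def multinomial_log_lik(x, logits):
    """log p(x | z) for click counts x under softmax(logits), up to the
    multinomial coefficient (constant in the model parameters)."""
    log_pi = logits - np.logaddexp.reduce(logits)   # log softmax
    return np.sum(x * log_pi)

rng = np.random.default_rng(0)
n_items = 1000
x = rng.integers(0, 2, n_items)          # binary click vector for one user
logits = rng.standard_normal(n_items)    # would come from the decoder network
print(multinomial_log_lik(x, logits))
```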

Bayesian Inference Collaborative Filtering +2

Music Transformer

12 code implementations ICLR 2019 Cheng-Zhi Anna Huang, Ashish Vaswani, Jakob Uszkoreit, Noam Shazeer, Ian Simon, Curtis Hawthorne, Andrew M. Dai, Matthew D. Hoffman, Monica Dinculescu, Douglas Eck

This is impractical for long sequences such as musical compositions, since the memory complexity of the intermediate relative position information is quadratic in the sequence length.
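
The quadratic blow-up comes from materializing an L x L x D tensor of relative-position embeddings; the paper's "skewing" trick instead computes Q @ E_r.T once (an L x L matrix) and rearranges it with a pad-reshape-slice. A numpy sketch of the commonly cited implementation:

```python
import numpy as np

def skew(qe_t):
    """Rearrange Q @ E_r.T (shape L x L, columns indexed by relative distance)
    so that entry (i, j) holds the logit for relative distance j - i."""
    L = qe_t.shape[0]
    padded = np.pad(qe_t, [(0, 0), (1, 0)])  # dummy column on the left
    return padded.reshape(L + 1, L)[1:]      # reshape, keep the last L rows

L, d = 4, 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((L, d))
E_r = rng.standard_normal((L, d))   # one embedding per relative distance
S_rel = skew(Q @ E_r.T)             # O(L*d) memory instead of O(L^2 * d)
```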

Music Generation Music Modeling

The LORACs prior for VAEs: Letting the Trees Speak for the Data

no code implementations16 Oct 2018 Sharad Vikram, Matthew D. Hoffman, Matthew J. Johnson

In variational autoencoders, the prior on the latent codes $z$ is often treated as an afterthought, but the prior shapes the kind of latent representation that the model learns.

Clustering

Autoconj: Recognizing and Exploiting Conjugacy Without a Domain-Specific Language

2 code implementations NeurIPS 2018 Matthew D. Hoffman, Matthew J. Johnson, Dustin Tran

Deriving conditional and marginal distributions using conjugacy relationships can be time consuming and error prone.
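
A one-line example of the kind of conjugacy relationship Autoconj recognizes mechanically: for a Beta prior and Bernoulli likelihood, the log-joint is linear in the sufficient statistics, so the posterior is again a Beta. A hand-derived sketch (Autoconj performs this rewriting automatically on numpy-like programs):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 100)     # Bernoulli(theta) observations
alpha, beta = 2.0, 2.0          # Beta(alpha, beta) prior on theta

# log p(theta, x) = (alpha - 1 + sum(x)) log theta
#                 + (beta - 1 + n - sum(x)) log(1 - theta) + const,
# which matches the Beta log-density, so:
alpha_post = alpha + x.sum()
beta_post = beta + len(x) - x.sum()
print(f"posterior: Beta({alpha_post}, {beta_post})")
```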

Automatic Reparameterisation of Probabilistic Programs

1 code implementation ICML 2020 Maria I. Gorinova, Dave Moore, Matthew D. Hoffman

Probabilistic programming has emerged as a powerful paradigm in statistics, applied science, and machine learning: by decoupling modelling from inference, it promises to allow modellers to directly reason about the processes generating data.
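
The canonical transformation at stake is centered vs. non-centered parameterization of hierarchical models: sampling theta = mu + sigma * eps with eps ~ N(0, 1) instead of theta ~ N(mu, sigma^2) leaves the model unchanged but can drastically change inference geometry. A minimal sketch of the two forms:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 0.1

# Centered: sample the latent directly.
theta_centered = rng.normal(mu, sigma, 1000)

# Non-centered: sample a standardized variable, then transform deterministically.
eps = rng.standard_normal(1000)
theta_noncentered = mu + sigma * eps

# Same marginal distribution; but when mu and sigma are themselves latent,
# the non-centered form decorrelates them from theta in the posterior,
# often fixing "funnel" pathologies for gradient-based samplers.
```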

Probabilistic Programming

Langevin Dynamics as Nonparametric Variational Inference

no code implementations AABI 2019 Matthew D. Hoffman, Yian Ma

Variational inference (VI) and Markov chain Monte Carlo (MCMC) are approximate posterior inference algorithms that are often said to have complementary strengths, with VI being fast but biased and MCMC being slower but asymptotically unbiased.
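
For reference, the update both views share is the Langevin step: gradient ascent on the log-density plus Gaussian noise. An unadjusted Langevin sketch in numpy (target and step size are illustrative):

```python
import numpy as np

def ula(grad_log_p, theta0, eps=0.01, steps=10_000, rng=np.random.default_rng(0)):
    """Unadjusted Langevin: theta += (eps/2) grad log p + sqrt(eps) * noise."""
    theta, out = np.asarray(theta0, float), []
    for _ in range(steps):
        theta = (theta + 0.5 * eps * grad_log_p(theta)
                 + np.sqrt(eps) * rng.standard_normal(theta.shape))
        out.append(theta)
    return np.array(out)

samples = ula(lambda th: -th, np.zeros(2))   # target: standard normal
print(samples.mean(axis=0), samples.std(axis=0))
```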

Variational Inference

Automatically Batching Control-Intensive Programs for Modern Accelerators

no code implementations23 Oct 2019 Alexey Radul, Brian Patton, Dougal Maclaurin, Matthew D. Hoffman, Rif A. Saurous

We present a general approach to batching arbitrary computations for accelerators such as GPUs.
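
The difficulty is that control flow (while loops, recursion) diverges across batch members. One standard way to batch it anyway, which gives the flavor of the problem the paper's machinery automates, is to run all members in lockstep behind an "active" mask; this sketch is an illustration of that idea, not the paper's virtual-machine approach:

```python
import numpy as np

# Batched Collatz iteration: each element runs a different number of steps.
x = np.array([7, 1, 12, 19], dtype=np.int64)
steps = np.zeros_like(x)
active = x != 1

while active.any():
    odd = (x % 2 == 1) & active
    even = ~odd & active
    x = np.where(odd, 3 * x + 1, x)   # both branches computed; masks select
    x = np.where(even, x // 2, x)
    steps += active
    active = x != 1

print(steps)  # per-element iteration counts, computed in lockstep
```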

What Are Bayesian Neural Network Posteriors Really Like?

3 code implementations29 Apr 2021 Pavel Izmailov, Sharad Vikram, Matthew D. Hoffman, Andrew Gordon Wilson

The posterior over Bayesian neural network (BNN) parameters is extremely high-dimensional and non-convex.

Data Augmentation Variational Inference

Lossy Compression with Gaussian Diffusion

no code implementations17 Jun 2022 Lucas Theis, Tim Salimans, Matthew D. Hoffman, Fabian Mentzer

Unlike modern compression schemes, which rely on transform coding and quantization to restrict the transmitted information, DiffC relies on the efficient communication of pixels corrupted by Gaussian noise.

Quantization

Sequential Monte Carlo Learning for Time Series Structure Discovery

1 code implementation13 Jul 2023 Feras A. Saad, Brian J. Patton, Matthew D. Hoffman, Rif A. Saurous, Vikash K. Mansinghka

This paper presents a new approach to automatically discovering accurate models of complex time series data.

Time Series

Training Chain-of-Thought via Latent-Variable Inference

no code implementations NeurIPS 2023 Du Phan, Matthew D. Hoffman, David Dohan, Sholto Douglas, Tuan Anh Le, Aaron Parisi, Pavel Sountsov, Charles Sutton, Sharad Vikram, Rif A. Saurous

Large language models (LLMs) solve problems more accurately and interpretably when instructed to work out the answer step by step using a "chain-of-thought" (CoT) prompt.

GSM8K

Robust Inverse Graphics via Probabilistic Inference

no code implementations2 Feb 2024 Tuan Anh Le, Pavel Sountsov, Matthew D. Hoffman, Ben Lee, Brian Patton, Rif A. Saurous

How do we infer a 3D scene from a single image in the presence of corruptions like rain, snow or fog?
