1 code implementation • 13 Jun 2022 • Dieterich Lawson, Allan Raventós, Andrew Warrington, Scott Linderman
Sequential Monte Carlo (SMC) is an inference algorithm for state space models that approximates the posterior by sampling from a sequence of target distributions.
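As a rough illustration of the general idea (a minimal sketch, not this paper's specific method), a bootstrap particle filter approximates the posterior of a state space model by propagating, weighting, and resampling particles; `init_sample`, `transition_sample`, and `emission_logpdf` are hypothetical user-supplied callables describing the model.

```python
import numpy as np

def bootstrap_particle_filter(observations, init_sample, transition_sample,
                              emission_logpdf, num_particles=100, rng=None):
    """Minimal bootstrap SMC: propagate particles through the transition model,
    weight them by the emission likelihood, and resample at every step."""
    rng = np.random.default_rng() if rng is None else rng
    particles = init_sample(num_particles, rng)           # draw x_0 from the prior
    log_evidence = 0.0                                     # running estimate of log p(y_{1:T})
    for y_t in observations:
        particles = transition_sample(particles, rng)     # propose x_t ~ p(x_t | x_{t-1})
        log_w = emission_logpdf(y_t, particles)            # log p(y_t | x_t) per particle
        m = log_w.max()
        log_evidence += m + np.log(np.mean(np.exp(log_w - m)))
        w = np.exp(log_w - m)
        w /= w.sum()
        idx = rng.choice(num_particles, size=num_particles, p=w)  # multinomial resampling
        particles = particles[idx]
    return particles, log_evidence
```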
2 code implementations • 9 Oct 2021 • Ian Osband, Zheng Wen, Seyed Mohammad Asghari, Vikranth Dwaracherla, Botao Hao, Morteza Ibrahimi, Dieterich Lawson, Xiuyuan Lu, Brendan O'Donoghue, Benjamin Van Roy
Predictive distributions quantify uncertainties ignored by point estimates.
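A toy numeric illustration (hypothetical, not taken from the paper): three ensemble members whose averaged prediction is identical whether they agree or disagree, so the spread across members carries information a point estimate discards.

```python
import numpy as np

# Hypothetical two-class example: three ensemble members with the same mean prediction
# but very different levels of agreement.
member_probs = np.array([
    [0.95, 0.05],   # member 1: confident in class 0
    [0.05, 0.95],   # member 2: confident in class 1
    [0.50, 0.50],   # member 3: genuinely unsure
])

point_estimate = member_probs.mean(axis=0)   # [0.5, 0.5] -- same as three unsure members
spread = member_probs.std(axis=0)            # disagreement the point estimate ignores
```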
no code implementations • 29 Sep 2021 • Ian Osband, Zheng Wen, Seyed Mohammad Asghari, Xiuyuan Lu, Morteza Ibrahimi, Vikranth Dwaracherla, Dieterich Lawson, Brendan O'Donoghue, Botao Hao, Benjamin Van Roy
This paper introduces The Neural Testbed, which provides tools for the systematic evaluation of agents that generate such predictions.
1 code implementation • NeurIPS 2019 • Dieterich Lawson, George Tucker, Bo Dai, Rajesh Ranganath
Motivated by this, we consider the sampler-induced distribution as the model of interest and maximize the likelihood of this model.
no code implementations • ICLR Workshop DeepGenStruct 2019 • Dieterich Lawson, George Tucker, Bo Dai, Rajesh Ranganath
The success of enriching the variational family with auxiliary latent variables motivates applying the same techniques to the generative model.
3 code implementations • ICLR 2019 • George Tucker, Dieterich Lawson, Shixiang Gu, Chris J. Maddison
Burda et al. (2015) introduced a multi-sample variational bound, IWAE, that is at least as tight as the standard variational lower bound and becomes increasingly tight as the number of samples increases.
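For reference, the multi-sample bound in question is standardly written as

```latex
\mathcal{L}_K(x) \;=\; \mathbb{E}_{z_1,\dots,z_K \sim q(z \mid x)}\!\left[\log \frac{1}{K}\sum_{k=1}^{K}\frac{p(x, z_k)}{q(z_k \mid x)}\right] \;\le\; \log p(x),
```

where $\mathcal{L}_1$ recovers the standard ELBO and the bound tightens as $K$ grows.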
no code implementations • 16 Jun 2017 • Chung-Cheng Chiu, Dieterich Lawson, Yuping Luo, George Tucker, Kevin Swersky, Ilya Sutskever, Navdeep Jaitly
This is because the models require that the entirety of the input sequence be available at the beginning of inference, an assumption that is not valid for instantaneous speech recognition.
3 code implementations • NeurIPS 2017 • Chris J. Maddison, Dieterich Lawson, George Tucker, Nicolas Heess, Mohammad Norouzi, Andriy Mnih, Arnaud Doucet, Yee Whye Teh
When used as a surrogate objective for maximum likelihood estimation in latent variable models, the evidence lower bound (ELBO) produces state-of-the-art results.
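For context, the ELBO is the standard lower bound on the log marginal likelihood of a latent variable model:

```latex
\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x, z) - \log q_\phi(z \mid x)\big] \;=\; \mathrm{ELBO}(\theta, \phi; x).
```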
no code implementations • 16 May 2017 • Dieterich Lawson, Chung-Cheng Chiu, George Tucker, Colin Raffel, Kevin Swersky, Navdeep Jaitly
There has recently been significant interest in hard attention models for tasks such as object recognition, visual captioning and speech recognition.
3 code implementations • NeurIPS 2017 • George Tucker, Andriy Mnih, Chris J. Maddison, Dieterich Lawson, Jascha Sohl-Dickstein
Learning in models with discrete latent variables is challenging due to high variance gradient estimators.
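As a toy illustration of why this is hard (not the estimator proposed in this paper), the score-function (REINFORCE) gradient for a Bernoulli latent is unbiased but often very noisy, since no reparameterization is available for discrete samples.

```python
import numpy as np

def score_function_grad(theta, f, num_samples=1000, rng=None):
    """Score-function (REINFORCE) estimate of d/d(theta) E_{b ~ Bernoulli(theta)}[f(b)].
    Unbiased, but typically much noisier than pathwise estimators, which do not
    exist for discrete b."""
    rng = np.random.default_rng() if rng is None else rng
    b = (rng.random(num_samples) < theta).astype(float)             # discrete samples
    dlogp = np.where(b == 1.0, 1.0 / theta, -1.0 / (1.0 - theta))   # d log p(b) / d theta
    return np.mean(f(b) * dlogp)

# Repeated calls show the spread of the estimate around the true gradient.
estimates = [score_function_grad(0.3, lambda b: (b - 0.45) ** 2) for _ in range(5)]
```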
no code implementations • 16 Mar 2017 • Chris J. Maddison, Dieterich Lawson, George Tucker, Nicolas Heess, Arnaud Doucet, Andriy Mnih, Yee Whye Teh
The policy gradients of the expected return objective can react slowly to rare rewards.
no code implementations • 24 Feb 2017 • Augustus Odena, Dieterich Lawson, Christopher Olah
Machine learning models are often used at test-time subject to constraints and trade-offs not present at training-time.
1 code implementation • 22 Feb 2017 • Colin Raffel, Dieterich Lawson
We describe a mechanism for subsampling sequences and show how to compute its expected output so that it can be trained with standard backpropagation.
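A minimal sketch of how such an expectation could be computed, assuming each element is emitted independently with a per-step probability; the function name and the exact dynamic program here are illustrative assumptions, not necessarily the paper's formulation.

```python
import numpy as np

def expected_subsampled_output(s, p):
    """Expected output of a stochastic subsampler that emits element s[t] with probability p[t].

    E[y[n]] = sum_t P(s[t] is the n-th emitted element) * s[t], where
    P(s[t] is the n-th emission) = p[t] * P(exactly n-1 emissions among the first t-1 elements).
    The DP table count[t][n] holds P(exactly n emissions among the first t elements), so every
    term is differentiable in p and can be trained with standard backpropagation.
    """
    T = len(s)
    count = np.zeros((T + 1, T + 1))
    count[0, 0] = 1.0
    for t in range(1, T + 1):
        count[t, 0] = count[t - 1, 0] * (1.0 - p[t - 1])
        for n in range(1, t + 1):
            count[t, n] = count[t - 1, n - 1] * p[t - 1] + count[t - 1, n] * (1.0 - p[t - 1])
    expected = np.zeros(T)
    for n in range(1, T + 1):            # n-th output slot
        for t in range(n, T + 1):        # element t can fill slot n only if t >= n
            expected[n - 1] += p[t - 1] * count[t - 1, n - 1] * s[t - 1]
    return expected
```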