Search Results for author: Dieterich Lawson

Found 10 papers, 6 papers with code

Evaluating Predictive Distributions: Does Bayesian Deep Learning Work?

1 code implementation · 9 Oct 2021 · Ian Osband, Zheng Wen, Seyed Mohammad Asghari, Vikranth Dwaracherla, Botao Hao, Morteza Ibrahimi, Dieterich Lawson, Xiuyuan Lu, Brendan O'Donoghue, Benjamin Van Roy

This paper introduces The Neural Testbed, which provides tools for the systematic evaluation of agents that generate such predictive distributions.

Energy-Inspired Models: Learning with Sampler-Induced Distributions

1 code implementation · NeurIPS 2019 · Dieterich Lawson, George Tucker, Bo Dai, Rajesh Ranganath

Motivated by this, we consider the sampler-induced distribution as the model of interest and maximize the likelihood of this model.

Variational Inference

Doubly Reparameterized Gradient Estimators for Monte Carlo Objectives

3 code implementations · ICLR 2019 · George Tucker, Dieterich Lawson, Shixiang Gu, Chris J. Maddison

Burda et al. (2015) introduced a multi-sample variational bound, IWAE, that is at least as tight as the standard variational lower bound and becomes increasingly tight as the number of samples increases.

Latent Variable Models · Variational Inference
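For context, the multi-sample bound of Burda et al. (2015) referenced in the snippet above is commonly written as follows (standard notation, not quoted from this listing):

\mathcal{L}_K = \mathbb{E}_{z_1, \dots, z_K \sim q(z \mid x)} \left[ \log \frac{1}{K} \sum_{k=1}^{K} \frac{p(x, z_k)}{q(z_k \mid x)} \right] \le \log p(x)

With K = 1 this reduces to the standard ELBO, and the bound tightens as the number of samples K grows.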

An online sequence-to-sequence model for noisy speech recognition

no code implementations · 16 Jun 2017 · Chung-Cheng Chiu, Dieterich Lawson, Yuping Luo, George Tucker, Kevin Swersky, Ilya Sutskever, Navdeep Jaitly

This is because the models require that the entirety of the input sequence be available at the beginning of inference, an assumption that is not valid for instantaneous speech recognition.

Noisy Speech Recognition

Filtering Variational Objectives

3 code implementations · NeurIPS 2017 · Chris J. Maddison, Dieterich Lawson, George Tucker, Nicolas Heess, Mohammad Norouzi, Andriy Mnih, Arnaud Doucet, Yee Whye Teh

When used as a surrogate objective for maximum likelihood estimation in latent variable models, the evidence lower bound (ELBO) produces state-of-the-art results.

Latent Variable Models
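For reference, the evidence lower bound (ELBO) mentioned in the snippet above is, in standard notation (not quoted from the paper):

\log p(x) \ge \mathrm{ELBO}(q) = \mathbb{E}_{q(z \mid x)}\left[\log p(x, z) - \log q(z \mid x)\right]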

Learning Hard Alignments with Variational Inference

no code implementations · 16 May 2017 · Dieterich Lawson, Chung-Cheng Chiu, George Tucker, Colin Raffel, Kevin Swersky, Navdeep Jaitly

There has recently been significant interest in hard attention models for tasks such as object recognition, visual captioning and speech recognition.

Image Captioning · Object Recognition +3

Particle Value Functions

no code implementations · 16 Mar 2017 · Chris J. Maddison, Dieterich Lawson, George Tucker, Nicolas Heess, Arnaud Doucet, Andriy Mnih, Yee Whye Teh

The policy gradients of the expected return objective can react slowly to rare rewards.

Changing Model Behavior at Test-Time Using Reinforcement Learning

no code implementations · 24 Feb 2017 · Augustus Odena, Dieterich Lawson, Christopher Olah

Machine learning models are often used at test-time subject to constraints and trade-offs not present at training-time.

Translation

Training a Subsampling Mechanism in Expectation

1 code implementation · 22 Feb 2017 · Colin Raffel, Dieterich Lawson

We describe a mechanism for subsampling sequences and show how to compute its expected output so that it can be trained with standard backpropagation.
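The snippet above describes computing a subsampling mechanism's expected output so it can be trained with standard backpropagation. The sketch below is only a generic illustration of that idea: it assumes per-timestep inclusion probabilities and replaces the hard keep/drop decision with its expectation; it is not the specific mechanism from the paper.

```python
import numpy as np

def expected_subsample(x, p):
    """Expected output of a per-timestep keep/drop gate.

    x: (T, D) input sequence; p: (T,) inclusion probabilities in [0, 1].
    Hard subsampling would keep timestep t with probability p[t]; taking
    the expectation instead yields a deterministic, differentiable output.
    """
    return p[:, None] * x

# Toy usage: random sequence with sigmoid-parameterized inclusion probabilities.
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 3))                     # T=6 timesteps, D=3 features
p = 1.0 / (1.0 + np.exp(-rng.standard_normal(6)))   # sigmoid(logits)
print(expected_subsample(x, p).shape)               # (6, 3)
```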
