Search Results for author: Ben Poole

Found 49 papers, 26 papers with code

Categorical Reparameterization with Gumbel-Softmax

19 code implementations 3 Nov 2016 Eric Jang, Shixiang Gu, Ben Poole

Categorical variables are a natural choice for representing discrete structure in the world.

General Classification
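The trick named in the title draws approximately one-hot samples by perturbing logits with Gumbel noise and applying a temperature-controlled softmax. A minimal NumPy sketch (the function name and toy probabilities are illustrative, not from the paper's code):

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Differentiable, approximately one-hot sample from a categorical.

    Perturbs the logits with Gumbel(0, 1) noise and applies a softmax
    with temperature tau; as tau -> 0 the output approaches a one-hot
    sample, while larger tau gives a smoother relaxation.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max()              # shift for numerical stability
    exp_y = np.exp(y)
    return exp_y / exp_y.sum()

sample = gumbel_softmax(np.log(np.array([0.1, 0.6, 0.3])), tau=0.5)
print(sample)  # non-negative entries summing to 1
```

Because the sample is a deterministic, smooth function of the logits given the noise, gradients can flow through it, which is what makes the relaxation useful for training.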

Weakly Supervised Disentanglement with Guarantees

1 code implementation ICLR 2020 Rui Shu, Yining Chen, Abhishek Kumar, Stefano Ermon, Ben Poole

Learning disentangled representations that correspond to factors of variation in real-world data is critical to interpretable and human-controllable machine learning.

Disentanglement

Autoregressive Diffusion Models

2 code implementations ICLR 2022 Emiel Hoogeboom, Alexey A. Gritsenko, Jasmijn Bastings, Ben Poole, Rianne van den Berg, Tim Salimans

We introduce Autoregressive Diffusion Models (ARDMs), a model class encompassing and generalizing order-agnostic autoregressive models (Uria et al., 2014) and absorbing discrete diffusion (Austin et al., 2021), which we show are special cases of ARDMs under mild assumptions.

Ranked #8 on Image Generation on CIFAR-10 (bits/dimension metric)

Image Generation

Zero-Shot Text-Guided Object Generation with Dream Fields

4 code implementations CVPR 2022 Ajay Jain, Ben Mildenhall, Jonathan T. Barron, Pieter Abbeel, Ben Poole

Our method, Dream Fields, can generate the geometry and color of a wide range of objects without 3D supervision.

Neural Rendering, Object

DreamFusion: Text-to-3D using 2D Diffusion

4 code implementations 29 Sep 2022 Ben Poole, Ajay Jain, Jonathan T. Barron, Ben Mildenhall

Using this loss in a DeepDream-like procedure, we optimize a randomly-initialized 3D model (a Neural Radiance Field, or NeRF) via gradient descent such that its 2D renderings from random angles achieve a low loss.

Denoising, Image Generation +1
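The entry above describes optimizing a randomly-initialized 3D model so that its renderings from random angles score well under a frozen 2D prior. A heavily simplified stand-in for that loop, with a hypothetical linear "renderer" and a fixed target playing the role of the diffusion guidance (none of this is the paper's NeRF or prior):

```python
import numpy as np

# Render the "scene" from a random angle, score the rendering against a
# frozen guidance signal, and descend the gradient. The linear renderer
# and fixed target below are illustrative placeholders only.

rng = np.random.default_rng(0)
theta = rng.normal(size=4)                 # randomly-initialized "scene"
target = np.array([1.0, 0.0, -1.0, 0.5])  # what the frozen guidance prefers

def render(params, angle):
    # toy view-dependent linear "renderer"
    return np.cos(angle) * params + np.sin(angle) * np.roll(params, 1)

lr = 0.05
for _ in range(500):
    angle = rng.uniform(0.0, 2.0 * np.pi)  # random camera angle
    residual = render(theta, angle) - render(target, angle)
    # backprop through the linear renderer (adjoint of np.roll(., 1))
    grad = np.cos(angle) * residual + np.sin(angle) * np.roll(residual, -1)
    theta -= lr * grad

print(np.abs(theta - target).max())  # near zero: all views now score well
```

The key structural point survives the simplification: the 3D parameters are never supervised directly; they only receive gradients through rendered 2D views.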

What Makes for Good Views for Contrastive Learning?

1 code implementation NeurIPS 2020 Yonglong Tian, Chen Sun, Ben Poole, Dilip Krishnan, Cordelia Schmid, Phillip Isola

Contrastive learning between multiple views of the data has recently achieved state of the art performance in the field of self-supervised representation learning.

Contrastive Learning, Data Augmentation +8

Continual Learning Through Synaptic Intelligence

5 code implementations ICML 2017 Friedemann Zenke, Ben Poole, Surya Ganguli

While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning.

Computational Efficiency, Continual Learning +1

Score-Based Generative Modeling through Stochastic Differential Equations

10 code implementations ICLR 2021 Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, Stefano Ermon, Ben Poole

Combined with multiple architectural improvements, we achieve record-breaking performance for unconditional image generation on CIFAR-10 with an Inception score of 9.89 and FID of 2.20, a competitive likelihood of 2.99 bits/dim, and demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.

Colorization, Density Estimation +2
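Sampling in this framework means integrating a reverse-time SDE whose drift involves the score of the noised data distribution. An Euler-Maruyama sketch on a toy problem where the score is known in closed form (assumptions: forward process dx = dW and data a point mass at 0, so the time-t marginal is N(0, t) with score -x/t; purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_steps = 2000, 1000
t, t_end = 1.0, 1e-3
dt = (t - t_end) / n_steps

x = rng.normal(scale=np.sqrt(t), size=n_samples)  # start from the prior
for _ in range(n_steps):
    score = -x / t  # exact score of N(0, t) in this toy setup
    # reverse-time drift is -g(t)^2 * score with g = 1; add diffusion noise
    x = x + dt * score + np.sqrt(dt) * rng.normal(size=n_samples)
    t -= dt

print(x.mean(), x.std())  # both shrink toward the data point at 0
```

In the actual method the closed-form score is replaced by a learned score network, and better integrators (predictor-corrector, probability flow ODE) replace plain Euler-Maruyama.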

Weakly-Supervised Disentanglement Without Compromises

3 code implementations ICML 2020 Francesco Locatello, Ben Poole, Gunnar Rätsch, Bernhard Schölkopf, Olivier Bachem, Michael Tschannen

Third, we perform a large-scale empirical study and show that such pairs of observations are sufficient to reliably learn disentangled representations on several benchmark data sets.

Disentanglement, Fairness

VeLO: Training Versatile Learned Optimizers by Scaling Up

1 code implementation 17 Nov 2022 Luke Metz, James Harrison, C. Daniel Freeman, Amil Merchant, Lucas Beyer, James Bradbury, Naman Agrawal, Ben Poole, Igor Mordatch, Adam Roberts, Jascha Sohl-Dickstein

While deep learning models have replaced hand-designed features across many domains, these models are still trained with hand-designed optimizers.

Discrete Flows: Invertible Generative Models of Discrete Data

2 code implementations NeurIPS 2019 Dustin Tran, Keyon Vafa, Kumar Krishna Agrawal, Laurent Dinh, Ben Poole

While normalizing flows have led to significant advances in modeling high-dimensional continuous distributions, their applicability to discrete distributions remains unknown.

Language Modelling

Adversarially Learned Inference

9 code implementations 2 Jun 2016 Vincent Dumoulin, Ishmael Belghazi, Ben Poole, Olivier Mastropietro, Alex Lamb, Martin Arjovsky, Aaron Courville

We introduce the adversarially learned inference (ALI) model, which jointly learns a generation network and an inference network using an adversarial process.

Image-to-Image Translation

Categorical Reparametrization with Gumbel-Softmax

1 code implementation ICLR 2017 Eric Jang, Shixiang Gu, Ben Poole

Categorical variables are a natural choice for representing discrete structure in the world.

Unrolled Generative Adversarial Networks

9 code implementations 7 Nov 2016 Luke Metz, Ben Poole, David Pfau, Jascha Sohl-Dickstein

We introduce a method to stabilize Generative Adversarial Networks (GANs) by defining the generator objective with respect to an unrolled optimization of the discriminator.
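The unrolling idea can be illustrated on the scalar minimax game f(g, d) = g * d, which is a toy example and not the paper's GAN setup. Differentiating through one simulated discriminator ascent step d' = d + eta * g gives the generator the gradient d + 2 * eta * g, whose extra term damps the oscillation that plain alternating gradient play exhibits on this game:

```python
import numpy as np

lr, eta = 0.1, 0.5

g_plain = d_plain = 1.0    # plain simultaneous gradient play
g_unroll = d_unroll = 1.0  # generator unrolls one discriminator step
for _ in range(500):
    g_plain, d_plain = g_plain - lr * d_plain, d_plain + lr * g_plain
    g_unroll, d_unroll = (g_unroll - lr * (d_unroll + 2 * eta * g_unroll),
                          d_unroll + lr * g_unroll)

print(np.hypot(g_plain, d_plain))    # grows: plain play spirals outward
print(np.hypot(g_unroll, d_unroll))  # shrinks: unrolled play converges
```

The paper unrolls several SGD steps of a full discriminator network, but the mechanism is the same: the generator anticipates the discriminator's response instead of exploiting its current state.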

On Variational Bounds of Mutual Information

3 code implementations 16 May 2019 Ben Poole, Sherjil Ozair, Aaron van den Oord, Alexander A. Alemi, George Tucker

Estimating and optimizing Mutual Information (MI) is core to many problems in machine learning; however, bounding MI in high dimensions is challenging.

Representation Learning
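One of the tractable lower bounds studied in this area, InfoNCE, is easy to compute from a K x K matrix of critic scores f(x_i, y_j) over K paired samples: I(X; Y) >= mean_i[f(x_i, y_i) - log mean_j exp(f(x_i, y_j))], and the estimate can never exceed log K. A NumPy sketch on toy correlated Gaussians (the quadratic critic is an illustrative choice, not one from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
K = 512
x = rng.normal(size=K)
y = x + 0.1 * rng.normal(size=K)           # strongly dependent pair

scores = -(x[:, None] - y[None, :]) ** 2   # quadratic critic, K x K

# numerically stable log-mean-exp over each row
row_max = scores.max(axis=1, keepdims=True)
log_mean_exp = np.log(np.exp(scores - row_max).mean(axis=1)) + row_max[:, 0]
infonce = float(np.mean(np.diag(scores) - log_mean_exp))

print(infonce)  # positive here, and never larger than log(K)
```

The log K cap is exactly the high-MI saturation problem the paper analyzes: with a batch of 512 the estimator cannot report more than about 6.24 nats, no matter how dependent X and Y are.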

Variational Diffusion Models

4 code implementations 1 Jul 2021 Diederik P. Kingma, Tim Salimans, Ben Poole, Jonathan Ho

In addition, we show that the continuous-time VLB is invariant to the noise schedule, except for the signal-to-noise ratio at its endpoints.

Density Estimation, Image Generation

On Density Estimation with Diffusion Models

1 code implementation NeurIPS 2021 Diederik Kingma, Tim Salimans, Ben Poole, Jonathan Ho

In addition, we show that the continuous-time VLB is invariant to the noise schedule, except for the signal-to-noise ratio at its endpoints.

Density Estimation

The Fast Bilateral Solver

2 code implementations 10 Nov 2015 Jonathan T. Barron, Ben Poole

We present the bilateral solver, a novel algorithm for edge-aware smoothing that combines the flexibility and speed of simple filtering approaches with the accuracy of domain-specific optimization algorithms.

Colorization, Semantic Segmentation

Fast large-scale optimization by unifying stochastic gradient and quasi-Newton methods

1 code implementation 9 Nov 2013 Jascha Sohl-Dickstein, Ben Poole, Surya Ganguli

This algorithm contrasts with earlier stochastic second order techniques that treat the Hessian of each contributing function as a noisy approximation to the full Hessian, rather than as a target for direct estimation.

Computational Efficiency

Video Interpolation with Diffusion Models

1 code implementation 1 Apr 2024 Siddhant Jain, Daniel Watson, Eric Tabellion, Aleksander Hołyński, Ben Poole, Janne Kontkanen

We present VIDIM, a generative model for video interpolation, which creates short videos given a start and end frame.

Super-Resolution

Exponential expressivity in deep neural networks through transient chaos

1 code implementation NeurIPS 2016 Ben Poole, Subhaneil Lahiri, Maithra Raghu, Jascha Sohl-Dickstein, Surya Ganguli

We combine Riemannian geometry with the mean field theory of high dimensional chaos to study the nature of signal propagation in generic, deep neural networks with random weights.

Learning Energy-Based Models by Diffusion Recovery Likelihood

2 code implementations ICLR 2021 Ruiqi Gao, Yang Song, Ben Poole, Ying Nian Wu, Diederik P. Kingma

Inspired by recent progress on diffusion probabilistic models, we present a diffusion recovery likelihood method to tractably learn and sample from a sequence of EBMs trained on increasingly noisy versions of a dataset.

Image Generation

Improving Robustness Without Sacrificing Accuracy with Patch Gaussian Augmentation

2 code implementations 6 Jun 2019 Raphael Gontijo Lopes, Dong Yin, Ben Poole, Justin Gilmer, Ekin D. Cubuk

Deploying machine learning systems in the real world requires both high accuracy on clean data and robustness to naturally occurring corruptions.

Data Augmentation, object-detection +1

Fixing a Broken ELBO

1 code implementation ICML 2018 Alexander A. Alemi, Ben Poole, Ian Fischer, Joshua V. Dillon, Rif A. Saurous, Kevin Murphy

Recent work in unsupervised representation learning has focused on learning deep directed latent-variable models.

Representation Learning

On the Expressive Power of Deep Neural Networks

no code implementations ICML 2017 Maithra Raghu, Ben Poole, Jon Kleinberg, Surya Ganguli, Jascha Sohl-Dickstein

We propose a new approach to the problem of neural network expressivity, which seeks to characterize how structural properties of a neural network family affect the functions it is able to compute.

Improved generator objectives for GANs

no code implementations 8 Dec 2016 Ben Poole, Alexander A. Alemi, Jascha Sohl-Dickstein, Anelia Angelova

We present a framework to understand GAN training as alternating density ratio estimation and approximate divergence minimization.

Density Ratio Estimation

Survey of Expressivity in Deep Neural Networks

no code implementations 24 Nov 2016 Maithra Raghu, Ben Poole, Jon Kleinberg, Surya Ganguli, Jascha Sohl-Dickstein

This quantity grows exponentially in the depth of the network, and is responsible for the depth sensitivity observed.

Analyzing noise in autoencoders and deep networks

no code implementations 6 Jun 2014 Ben Poole, Jascha Sohl-Dickstein, Surya Ganguli

Autoencoders have emerged as a useful framework for unsupervised learning of internal representations, and a wide variety of apparently conceptually disparate regularization techniques have been proposed to generate useful features.

Denoising

An information-theoretic analysis of deep latent-variable models

no code implementations ICLR 2018 Alex Alemi, Ben Poole, Ian Fischer, Josh Dillon, Rif A. Saurous, Kevin Murphy

We present an information-theoretic framework for understanding trade-offs in unsupervised learning of deep latent-variable models using variational inference.

Variational Inference

Preventing Posterior Collapse with delta-VAEs

no code implementations ICLR 2019 Ali Razavi, Aäron van den Oord, Ben Poole, Oriol Vinyals

Due to the phenomenon of "posterior collapse," current latent variable generative models pose a challenging design choice that either weakens the capacity of the decoder or requires augmenting the objective so it does not only maximize the likelihood of the data.

Ranked #7 on Image Generation on ImageNet 32x32 (bpd metric)

Image Generation, Representation Learning

On Predictive Information in RNNs

no code implementations 21 Oct 2019 Zhe Dong, Deniz Oktay, Ben Poole, Alexander A. Alemi

Certain biological neurons demonstrate a remarkable capability to optimally compress the history of sensory inputs while being maximally informative about the future.

Information Plane

On Implicit Regularization in $β$-VAEs

no code implementations 31 Jan 2020 Abhishek Kumar, Ben Poole

While the impact of variational inference (VI) on posterior inference in a fixed generative model is well-characterized, its role in regularizing a learned generative model when used in variational autoencoders (VAEs) is poorly understood.

Variational Inference

Regularized Autoencoders via Relaxed Injective Probability Flow

no code implementations 20 Feb 2020 Abhishek Kumar, Ben Poole, Kevin Murphy

Invertible flow-based generative models are an effective method for learning to generate samples, while allowing for tractable likelihood computation and inference.

On Implicit Regularization in $\beta$-VAEs

no code implementations ICML 2020 Abhishek Kumar, Ben Poole

While the impact of variational inference (VI) on posterior inference in a fixed generative model is well-characterized, its role in regularizing a learned generative model when used in variational autoencoders (VAEs) is poorly understood.

Variational Inference

Tasks, stability, architecture, and compute: Training more effective learned optimizers, and using them to train themselves

no code implementations 23 Sep 2020 Luke Metz, Niru Maheswaranathan, C. Daniel Freeman, Ben Poole, Jascha Sohl-Dickstein

In this work we focus on general-purpose learned optimizers capable of training a wide variety of problems with no user-specified hyperparameters.

Overcoming barriers to the training of effective learned optimizers

no code implementations 1 Jan 2021 Luke Metz, Niru Maheswaranathan, C. Daniel Freeman, Ben Poole, Jascha Sohl-Dickstein

In this work we focus on general-purpose learned optimizers capable of training a wide variety of problems with no user-specified hyperparameters.

Non-saturating GAN training as divergence minimization

no code implementations 15 Oct 2020 Matt Shannon, Ben Poole, Soroosh Mariooryad, Tom Bagby, Eric Battenberg, David Kao, Daisy Stanton, RJ Skerry-Ryan

Non-saturating generative adversarial network (GAN) training is widely used and has continued to obtain groundbreaking results.

Generative Adversarial Network

VIB is Half Bayes

no code implementations AABI (Approximate Inference) Symposium 2021 Alexander A Alemi, Warren R Morningstar, Ben Poole, Ian Fischer, Joshua V Dillon

In discriminative settings such as regression and classification there are two random variables at play, the inputs X and the targets Y.

regression

On Predictive Information Sub-optimality of RNNs

no code implementations 25 Sep 2019 Zhe Dong, Deniz Oktay, Ben Poole, Alexander A. Alemi

Certain biological neurons demonstrate a remarkable capability to optimally compress the history of sensory inputs while being maximally informative about the future.

Information Plane

Learning a Diffusion Prior for NeRFs

no code implementations 27 Apr 2023 Guandao Yang, Abhijit Kundu, Leonidas J. Guibas, Jonathan T. Barron, Ben Poole

Neural Radiance Fields (NeRFs) have emerged as a powerful neural 3D representation for objects and scenes derived from 2D data.

Variational Prediction

no code implementations 14 Jul 2023 Alexander A. Alemi, Ben Poole

In this paper, we present variational prediction, a technique for directly learning a variational approximation to the posterior predictive distribution using a variational bound.

Bayesian Inference
