Search Results for author: Danilo Jimenez Rezende

Found 25 papers, 14 papers with code

Learning to Induce Causal Structure

no code implementations11 Apr 2022 Nan Rosemary Ke, Silvia Chiappa, Jane Wang, Anirudh Goyal, Jorg Bornschein, Melanie Rey, Theophane Weber, Matthew Botvinick, Michael Mozer, Danilo Jimenez Rezende

The fundamental challenge in causal induction is to infer the underlying graph structure given observational and/or interventional data.

Introduction to Normalizing Flows for Lattice Field Theory

no code implementations20 Jan 2021 Michael S. Albergo, Denis Boyda, Daniel C. Hackett, Gurtej Kanwar, Kyle Cranmer, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan

This notebook tutorial demonstrates a method for sampling Boltzmann distributions of lattice field theories using a class of machine learning models known as normalizing flows.

BIG-bench Machine Learning
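
The core mechanism the tutorial demonstrates can be sketched in a few lines: draw from a simple base density, push the sample through an invertible map, and track the change-of-variables correction to the log-density. The affine map and its parameters below are hypothetical stand-ins for a trained flow, not the models in the tutorial:

```python
import math, random

random.seed(0)

# Toy 1-D "flow": an invertible affine map x = a*z + b pushing a standard
# normal base distribution toward a target Boltzmann density.
a, b = 2.0, 0.5  # hypothetical trained parameters

def log_base(z):
    # log-density of the standard normal base distribution
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def flow_sample():
    z = random.gauss(0.0, 1.0)
    x = a * z + b
    # change of variables: log q(x) = log p(z) - log|dx/dz|
    log_q = log_base(z) - math.log(abs(a))
    return x, log_q

samples = [flow_sample() for _ in range(10000)]
mean_x = sum(x for x, _ in samples) / len(samples)
```

In the actual lattice setting, the model log-density `log_q` is reweighted against the target Boltzmann action to obtain unbiased estimates.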

Sampling using $SU(N)$ gauge equivariant flows

no code implementations12 Aug 2020 Denis Boyda, Gurtej Kanwar, Sébastien Racanière, Danilo Jimenez Rezende, Michael S. Albergo, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan

We develop a flow-based sampling algorithm for $SU(N)$ lattice gauge theories that is gauge-invariant by construction.

Neural Communication Systems with Bandwidth-limited Channel

no code implementations30 Mar 2020 Karen Ullrich, Fabio Viola, Danilo Jimenez Rezende

Reliably transmitting messages despite information loss due to a noisy channel is a core problem of information theory.

Equivariant flow-based sampling for lattice gauge theory

no code implementations13 Mar 2020 Gurtej Kanwar, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan

We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge-invariant by construction.

Targeted free energy estimation via learned mappings

no code implementations12 Feb 2020 Peter Wirnsberger, Andrew J. Ballard, George Papamakarios, Stuart Abercrombie, Sébastien Racanière, Alexander Pritzel, Danilo Jimenez Rezende, Charles Blundell

Here, we cast Targeted FEP as a machine learning problem in which the mapping is parameterized as a neural network that is optimized so as to increase overlap.
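
The estimator behind targeted FEP can be illustrated on a toy 1-D system (the potentials, mapping, and kT = 1 below are illustrative assumptions, not the paper's setup). When the mapping transports state A's Boltzmann distribution exactly onto state B's, the Zwanzig-style estimator becomes zero-variance:

```python
import math, random

random.seed(1)

# Toy targeted FEP: two 1-D Boltzmann states with kT = 1.
def U_A(x): return 0.5 * x * x   # Z_A = sqrt(2*pi)
def U_B(x): return x * x / 8.0   # Z_B = sqrt(8*pi), so dF = -ln 2

def M(x): return 2.0 * x         # "learned" mapping (here chosen exact)
log_jac = math.log(2.0)          # log|dM/dx|

xs = [random.gauss(0.0, 1.0) for _ in range(5000)]    # samples from state A
# generalized work: Phi = U_B(M(x)) - U_A(x) - log|det J|
phi = [U_B(M(x)) - U_A(x) - log_jac for x in xs]
dF = -math.log(sum(math.exp(-p) for p in phi) / len(phi))
```

With the exact mapping, every `phi` equals `-ln 2`, so `dF` recovers the analytic free energy difference with zero variance; the paper's contribution is to learn `M` as a neural network so as to increase the overlap that controls this variance.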

Normalizing Flows for Probabilistic Modeling and Inference

6 code implementations5 Dec 2019 George Papamakarios, Eric Nalisnick, Danilo Jimenez Rezende, Shakir Mohamed, Balaji Lakshminarayanan

In this review, we attempt to provide such a perspective by describing flows through the lens of probabilistic modeling and inference.
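
The review's central identity is the change-of-variables formula through a composition of invertible maps, with the log-Jacobian terms accumulating across layers. A minimal sketch (the two 1-D layers below are illustrative, not from the review):

```python
import math

# A "flow" of two invertible 1-D layers; log-density accumulates log|det J|.
def fwd(z):
    logdet = 0.0
    x = 3.0 * z
    logdet += math.log(3.0)   # scale layer contributes log|dx/dz| = log 3
    x = x + 1.0               # shift layer has unit Jacobian
    return x, logdet

def log_prob(x):
    # invert the flow, then apply log q(x) = log p(z) - sum log|det J|
    z = (x - 1.0) / 3.0
    _, logdet = fwd(z)
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi) - logdet

# the resulting q is N(1, 9); evaluate its log-density at x = 2.5
lp = log_prob(2.5)
```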

Information bottleneck through variational glasses

no code implementations2 Dec 2019 Slava Voloshynovskiy, Mouad Kondah, Shideh Rezaeifar, Olga Taran, Taras Holotyak, Danilo Jimenez Rezende

In particular, we present a new interpretation of the VAE family based on the IB framework using a direct decomposition of mutual information terms and show some interesting connections to existing methods such as VAE [2; 3], beta-VAE [11], AAE [12], InfoVAE [5] and VAE/GAN [13].

Novelty Detection

Equivariant Hamiltonian Flows

no code implementations30 Sep 2019 Danilo Jimenez Rezende, Sébastien Racanière, Irina Higgins, Peter Toth

This paper introduces equivariant Hamiltonian flows, a method for learning expressive densities that are invariant with respect to a known Lie algebra of local symmetry transformations while providing an equivariant representation of the data.

Representation Learning
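
Hamiltonian flows build on the fact that Hamiltonian dynamics are volume-preserving. A toy sketch for H(q, p) = p²/2 + q²/2 using a leapfrog integrator (illustrative, not the paper's learned flows): energy is approximately conserved and the Jacobian determinant of the flow map is 1.

```python
import math

def grad_U(q):
    return q  # dU/dq for U(q) = q^2/2

def leapfrog(q, p, eps=0.1, steps=10):
    # velocity-Verlet: half kick, alternating drifts/kicks, half kick
    p -= 0.5 * eps * grad_U(q)
    for _ in range(steps - 1):
        q += eps * p
        p -= eps * grad_U(q)
    q += eps * p
    p -= 0.5 * eps * grad_U(q)
    return q, p

q0, p0 = 1.0, 0.0
q1, p1 = leapfrog(q0, p0)
H0 = 0.5 * (p0 * p0 + q0 * q0)
H1 = 0.5 * (p1 * p1 + q1 * q1)

# finite-difference Jacobian determinant of the map (q0, p0) -> (q1, p1)
h = 1e-6
qa, pa = leapfrog(q0 + h, p0)
qb, pb = leapfrog(q0, p0 + h)
det = ((qa - q1) / h) * ((pb - p1) / h) - ((qb - q1) / h) * ((pa - p1) / h)
```

The paper's contribution is to make such flows equivariant to a known symmetry group rather than generic, but the volume-preservation property checked here is the shared foundation.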

A Hierarchical Probabilistic U-Net for Modeling Multi-Scale Ambiguities

4 code implementations30 May 2019 Simon A. A. Kohl, Bernardino Romera-Paredes, Klaus H. Maier-Hein, Danilo Jimenez Rezende, S. M. Ali Eslami, Pushmeet Kohli, Andrew Zisserman, Olaf Ronneberger

Medical imaging only indirectly measures the molecular identity of the tissue within each voxel, which often produces only ambiguous image evidence for target measures of interest, like semantic segmentation.

Inductive Bias Instance Segmentation

Taming VAEs

3 code implementations1 Oct 2018 Danilo Jimenez Rezende, Fabio Viola

In spite of remarkable progress in deep latent variable generative modeling, training still remains a challenge due to a combination of optimization and generalization issues.

A Probabilistic U-Net for Segmentation of Ambiguous Images

9 code implementations NeurIPS 2018 Simon A. A. Kohl, Bernardino Romera-Paredes, Clemens Meyer, Jeffrey De Fauw, Joseph R. Ledsam, Klaus H. Maier-Hein, S. M. Ali Eslami, Danilo Jimenez Rezende, Olaf Ronneberger

To this end we propose a generative segmentation model based on a combination of a U-Net with a conditional variational autoencoder that is capable of efficiently producing an unlimited number of plausible hypotheses.

Decision Making Segmentation

Generative Temporal Models with Spatial Memory for Partially Observed Environments

no code implementations ICML 2018 Marco Fraccaro, Danilo Jimenez Rezende, Yori Zwols, Alexander Pritzel, S. M. Ali Eslami, Fabio Viola

In model-based reinforcement learning, generative and temporal models of environments can be leveraged to boost agent performance, either by tuning the agent's representations during training or via use as part of an explicit planning mechanism.

Model-based Reinforcement Learning

Variational Intrinsic Control

1 code implementation22 Nov 2016 Karol Gregor, Danilo Jimenez Rezende, Daan Wierstra

In this paper we introduce a new unsupervised reinforcement learning method for discovering the set of intrinsic options available to an agent.

Reinforcement Learning (RL) Unsupervised Reinforcement Learning

Towards Conceptual Compression

1 code implementation NeurIPS 2016 Karol Gregor, Frederic Besse, Danilo Jimenez Rezende, Ivo Danihelka, Daan Wierstra

We introduce a simple recurrent variational auto-encoder architecture that significantly improves image modeling.

Ranked #64 on Image Generation on CIFAR-10 (bits/dimension metric)

Image Generation

One-Shot Generalization in Deep Generative Models

no code implementations16 Mar 2016 Danilo Jimenez Rezende, Shakir Mohamed, Ivo Danihelka, Karol Gregor, Daan Wierstra

In particular, humans have an ability for one-shot generalization: an ability to encounter a new concept, understand its structure, and then be able to generate compelling alternative variations of the concept.

BIG-bench Machine Learning Density Estimation

Variational Information Maximisation for Intrinsically Motivated Reinforcement Learning

2 code implementations NeurIPS 2015 Shakir Mohamed, Danilo Jimenez Rezende

The mutual information is a core statistical quantity that has applications in all areas of machine learning, whether this is in training of density models over multiple data modalities, in maximising the efficiency of noisy transmission channels, or when learning behaviour policies for exploration by artificial agents.

BIG-bench Machine Learning reinforcement-learning
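
The variational bound at the heart of this line of work is the Barber–Agakov inequality, I(X;Y) ≥ H(X) + E[log q(x|y)], which is tight when the variational decoder q equals the true posterior. A toy check on a binary symmetric channel (the channel and flip probability below are illustrative assumptions):

```python
import math

e = 0.1                 # bit-flip probability of the channel
H_X = math.log(2)       # X uniform on {0, 1}, entropy in nats

def posterior(x, y):
    # true p(x|y) for the symmetric channel with uniform input
    return 1 - e if x == y else e

# exact expectation of log q(x|y) under p(x, y), with q set to the true posterior
expect = 0.0
for x in (0, 1):
    for y in (0, 1):
        p_xy = 0.5 * (1 - e if y == x else e)
        expect += p_xy * math.log(posterior(x, y))

bound = H_X + expect
# analytic mutual information of the channel: log 2 - H2(e)
true_mi = math.log(2) + e * math.log(e) + (1 - e) * math.log(1 - e)
```

With the exact posterior the bound meets the true mutual information; with a learned decoder it becomes a tractable objective for intrinsically motivated agents.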

Variational Inference with Normalizing Flows

16 code implementations21 May 2015 Danilo Jimenez Rezende, Shakir Mohamed

The choice of approximate posterior distribution is one of the core problems in variational inference.

Variational Inference
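
The paper's planar flow, f(z) = z + u·h(wᵀz + b), admits a closed-form log-Jacobian. A one-dimensional sketch with hypothetical parameters (chosen so that u·w > −1, which keeps the map invertible):

```python
import math, random

random.seed(2)

u, w, b = 0.5, 1.0, 0.0  # hypothetical flow parameters (invertible: u*w > -1)

def planar(z):
    # planar flow in 1-D: f(z) = z + u * tanh(w*z + b)
    h = math.tanh(w * z + b)
    x = z + u * h
    # closed-form log|det J| = log|1 + u*w*(1 - tanh(w*z+b)^2)|
    logdet = math.log(abs(1.0 + u * w * (1.0 - h * h)))
    return x, logdet

# push base samples through the flow; log q follows by change of variables
zs = [random.gauss(0.0, 1.0) for _ in range(5)]
out = [planar(z) for z in zs]
```

Stacking many such layers, each with its O(D) log-det term, yields the progressively richer approximate posteriors the paper proposes.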

DRAW: A Recurrent Neural Network For Image Generation

20 code implementations16 Feb 2015 Karol Gregor, Ivo Danihelka, Alex Graves, Danilo Jimenez Rezende, Daan Wierstra

This paper introduces the Deep Recurrent Attentive Writer (DRAW) neural network architecture for image generation.

Ranked #70 on Image Generation on CIFAR-10 (bits/dimension metric)

Foveation Image Generation

Stochastic Backpropagation and Approximate Inference in Deep Generative Models

5 code implementations16 Jan 2014 Danilo Jimenez Rezende, Shakir Mohamed, Daan Wierstra

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning.

Bayesian Inference
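
The "stochastic backpropagation" of this paper is what is now usually called the reparameterization trick: writing x = μ + σε with ε ~ N(0,1) lets gradients of E[f(x)] flow through the sampling step. A sketch with an illustrative f where the expected gradients are known in closed form:

```python
import random

random.seed(3)

mu, sigma = 1.0, 0.5

def f(x):  return x * x   # E[f] = mu^2 + sigma^2, so dE/dmu = 2*mu, dE/dsigma = 2*sigma
def df(x): return 2.0 * x

N = 20000
eps = [random.gauss(0.0, 1.0) for _ in range(N)]
# chain rule through x = mu + sigma*eps: dx/dmu = 1, dx/dsigma = eps
grad_mu = sum(df(mu + sigma * e) for e in eps) / N
grad_sigma = sum(df(mu + sigma * e) * e for e in eps) / N
```

The Monte Carlo estimates should land near the analytic values 2μ = 2.0 and 2σ = 1.0; in the paper this estimator is what makes end-to-end training of deep latent-variable models scalable.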
