Search Results for author: Danilo J. Rezende

Found 27 papers, 10 papers with code

Applications of flow models to the generation of correlated lattice QCD ensembles

no code implementations19 Jan 2024 Ryan Abbott, Aleksandar Botev, Denis Boyda, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban

Machine-learned normalizing flows can be used in the context of lattice quantum field theory to generate statistically correlated ensembles of lattice gauge fields at different action parameters.

Advances in machine-learning-based sampling motivated by lattice quantum chromodynamics

no code implementations3 Sep 2023 Kyle Cranmer, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Phiala E. Shanahan

This Perspective outlines the advances in ML-based sampling motivated by lattice quantum field theory, in particular for the theory of quantum chromodynamics.


Aspects of scaling and scalability for flow-based sampling of lattice QCD

no code implementations14 Nov 2022 Ryan Abbott, Michael S. Albergo, Aleksandar Botev, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Alexander G. D. G. Matthews, Sébastien Racanière, Ali Razavi, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban

Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.

Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions

no code implementations18 Jul 2022 Ryan Abbott, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Betsy Tian, Julian M. Urban

This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as stochastic estimators for the fermionic determinant.
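
The pseudofermion trick rests on writing a determinant as a Gaussian integral over auxiliary fields. A toy numpy sketch of that identity, using a small symmetric positive-definite matrix `M` in place of a lattice Dirac operator (an illustration of the principle only, not the paper's gauge-equivariant architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a fermion matrix: a small symmetric positive-definite M.
M = np.array([[1.3, 0.2],
              [0.2, 0.9]])

# Gaussian-integral identity behind pseudofermions: for x ~ N(0, I),
#   E[exp(-0.5 * x^T (M - I) x)] = det(M) ** -0.5
x = rng.standard_normal((200_000, 2))
quad = np.einsum('ni,ij,nj->n', x, M - np.eye(2), x)
estimate = np.mean(np.exp(-0.5 * quad))

exact = np.linalg.det(M) ** -0.5
```

Here the determinant is recovered stochastically from Gaussian samples rather than computed directly, which is the point of the pseudofermion formulation at lattice scale.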

Flow-based sampling in the lattice Schwinger model at criticality

no code implementations23 Feb 2022 Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban

In this work, we provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.

Continual Repeated Annealed Flow Transport Monte Carlo

2 code implementations31 Jan 2022 Alexander G. D. G. Matthews, Michael Arbel, Danilo J. Rezende, Arnaud Doucet

We propose Continual Repeated Annealed Flow Transport Monte Carlo (CRAFT), a method that combines a sequential Monte Carlo (SMC) sampler (itself a generalization of Annealed Importance Sampling) with variational inference using normalizing flows.

Variational Inference
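
As a rough sketch of the annealed SMC building block that CRAFT extends, here is plain annealed importance sampling with Metropolis moves (without the learned flow transport), for a 1-D Gaussian base and target whose normalizing-constant ratio is analytically 0.5. All values are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_gamma(x, beta):
    # Geometric path between exp(-x^2/2) (unnormalized base, beta=0)
    # and exp(-2 x^2) (unnormalized target, beta=1).
    return (1 - beta) * (-0.5 * x**2) + beta * (-2.0 * x**2)

n_particles, n_steps = 5000, 50
betas = np.linspace(0.0, 1.0, n_steps + 1)

x = rng.standard_normal(n_particles)   # exact samples from the base N(0, 1)
log_w = np.zeros(n_particles)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # Importance-weight update for the new annealing temperature.
    log_w += log_gamma(x, b) - log_gamma(x, b_prev)
    # One Metropolis random-walk step targeting gamma_b keeps particles warm.
    prop = x + 0.5 * rng.standard_normal(n_particles)
    accept = np.log(rng.random(n_particles)) < log_gamma(prop, b) - log_gamma(x, b)
    x = np.where(accept, prop, x)

# E[w] estimates Z_target / Z_base = sqrt(pi/2) / sqrt(2*pi) = 0.5.
z_ratio = np.exp(log_w).mean()
```

CRAFT replaces the identity transport between temperatures with learned normalizing-flow maps trained by variational inference, which is what the paper contributes on top of this skeleton.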

Implicit Riemannian Concave Potential Maps

no code implementations4 Oct 2021 Danilo J. Rezende, Sébastien Racanière

We are interested in the challenging problem of modelling densities on Riemannian manifolds with a known symmetry group using normalising flows.

Density Estimation · Normalising Flows

Flow-based sampling for fermionic lattice field theories

no code implementations10 Jun 2021 Michael S. Albergo, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Julian M. Urban, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan

Algorithms based on normalizing flows are emerging as promising machine learning approaches to sampling from complicated probability distributions in a way that can be made asymptotically exact.
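
The "asymptotically exact" property comes from the fact that a flow assigns an exact density to each of its own samples, so importance weights can correct for model error. A minimal 1-D illustration with an affine "flow" and a Gaussian target (all numbers are assumptions for the sketch, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# A one-layer "flow": x = a*z + b with z ~ N(0, 1).
a, b = 1.2, 0.5
z = rng.standard_normal(100_000)
x = a * z + b
# Exact model log-density via change of variables: log N(z; 0, 1) - log|a|.
log_q = -0.5 * z**2 - 0.5 * np.log(2 * np.pi) - np.log(abs(a))

# Target density: N(1, 0.8^2); only its log-density is needed.
mu, sigma = 1.0, 0.8
log_p = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)

# Self-normalized importance weights correct for the imperfect flow.
w = np.exp(log_p - log_q)
w /= w.sum()
mean_est = np.sum(w * x)   # approaches E_p[x] = 1.0 as samples grow
```

A better-trained flow concentrates the weights and lowers the variance, but even a crude flow yields consistent estimates, which is the property the lattice applications rely on.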

NeRF-VAE: A Geometry Aware 3D Scene Generative Model

1 code implementation1 Apr 2021 Adam R. Kosiorek, Heiko Strathmann, Daniel Zoran, Pol Moreno, Rosalia Schneider, Soňa Mokrá, Danilo J. Rezende

We propose NeRF-VAE, a 3D scene generative model that incorporates geometric structure via NeRF and differentiable volume rendering.
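
The differentiable volume rendering that NeRF-style models rely on composites densities and colours along each camera ray. A minimal numpy sketch of that compositing step for one ray, with scalar toy values (not the NeRF-VAE implementation):

```python
import numpy as np

# Densities and (scalar) colours at sample points along one camera ray.
sigma = np.array([0.0, 0.5, 3.0, 0.1])   # volume density per sample
color = np.array([0.2, 0.8, 1.0, 0.3])   # radiance per sample
delta = np.full(4, 0.25)                 # spacing between samples

alpha = 1.0 - np.exp(-sigma * delta)     # per-sample opacity
# Transmittance: probability the ray reaches each sample unoccluded.
trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
weights = trans * alpha
pixel = np.sum(weights * color)          # rendered pixel value
```

Because every step is differentiable, gradients flow from rendered pixels back to the scene representation, which is what lets the VAE be trained end to end.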

PARTS: Unsupervised Segmentation With Slots, Attention and Independence Maximization

no code implementations ICCV 2021 Daniel Zoran, Rishabh Kabra, Alexander Lerchner, Danilo J. Rezende

We present a model that is able to segment visual scenes from complex 3D environments into distinct objects, learn disentangled representations of individual objects, and form consistent and coherent predictions of future frames, in a fully unsupervised manner.

Representation Learning · Scene Segmentation

Amortized learning of neural causal representations

no code implementations21 Aug 2020 Nan Rosemary Ke, Jane X. Wang, Jovana Mitrovic, Martin Szummer, Danilo J. Rezende

The CRN represents causal models using continuous representations and hence can scale much better with the number of variables.

Towards Interpretable Reinforcement Learning Using Attention Augmented Agents

1 code implementation NeurIPS 2019 Alex Mott, Daniel Zoran, Mike Chrzanowski, Daan Wierstra, Danilo J. Rezende

Inspired by recent work in attention models for image captioning and question answering, we present a soft attention model for the reinforcement learning domain.

Image Captioning · Question Answering +2
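
A soft attention read is a convex combination of values weighted by softmax-normalized query-key similarities. A minimal numpy sketch of generic scaled dot-product attention (not the agent architecture from the paper; names and toy values are assumptions):

```python
import numpy as np

def soft_attention(query, keys, values):
    # Scaled dot-product attention: weights = softmax(k . q / sqrt(d)).
    scores = keys @ query / np.sqrt(query.shape[-1])
    scores -= scores.max()                      # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights @ values, weights

# Toy example: the query matches the second key most strongly.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([0.0, 4.0])
out, w = soft_attention(query, keys, values)
```

The weights `w` double as an interpretability signal: they show where the model is "looking", which is the property the paper exploits for interpretable agents.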

Consistent Generative Query Networks

no code implementations ICLR 2019 Ananya Kumar, S. M. Ali Eslami, Danilo J. Rezende, Marta Garnelo, Fabio Viola, Edward Lockhart, Murray Shanahan

These models typically generate future frames in an autoregressive fashion, which is slow and requires the input and output frames to be consecutive.

3D Scene Reconstruction · Video Prediction

Learning models for visual 3D localization with implicit mapping

no code implementations4 Jul 2018 Dan Rosenbaum, Frederic Besse, Fabio Viola, Danilo J. Rezende, S. M. Ali Eslami

We consider learning based methods for visual localization that do not require the construction of explicit maps in the form of point clouds or voxels.

Visual Localization

Neural Processes

13 code implementations4 Jul 2018 Marta Garnelo, Jonathan Schwarz, Dan Rosenbaum, Fabio Viola, Danilo J. Rezende, S. M. Ali Eslami, Yee Whye Teh

A neural network (NN) is a parameterised function that can be tuned via gradient descent to approximate a labelled collection of data with high precision.
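
A minimal illustration of that opening sentence: a two-parameter function fit to labelled data by gradient descent on mean squared error (a toy example of the general idea, unrelated to the Neural Process model itself):

```python
import numpy as np

rng = np.random.default_rng(3)

# A minimal parameterised function f(x) = w*x + b fit by gradient descent.
x = rng.uniform(-1, 1, 200)
y = 3.0 * x + 0.5                         # labelled data from a known rule

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * x + b
    grad_w = 2 * np.mean((pred - y) * x)  # d/dw of mean squared error
    grad_b = 2 * np.mean(pred - y)        # d/db of mean squared error
    w -= lr * grad_w
    b -= lr * grad_b
```

Neural Processes replace this per-dataset optimisation with a network that maps a context set directly to a predictive distribution, amortising the fitting step.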

Beyond Greedy Ranking: Slate Optimization via List-CVAE

1 code implementation ICLR 2019 Ray Jiang, Sven Gowal, Timothy A. Mann, Danilo J. Rezende

The conventional solution to the recommendation problem greedily ranks individual document candidates by prediction scores.

Variational Memory Addressing in Generative Models

1 code implementation NeurIPS 2017 Jörg Bornschein, Andriy Mnih, Daniel Zoran, Danilo J. Rezende

Aiming to augment generative models with external memory, we interpret the output of a memory module with stochastic addressing as a conditional mixture distribution, where a read operation corresponds to sampling a discrete memory address and retrieving the corresponding content from memory.

Few-Shot Learning · Representation Learning +1
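
A minimal numpy sketch of stochastic memory addressing as described above: similarity scores define a categorical addressing distribution, and a read samples a discrete address and retrieves its content, so the marginal read is a mixture over memory rows (illustrative only; in the paper the addressing is learned and trained inside a variational objective):

```python
import numpy as np

rng = np.random.default_rng(4)

# External memory: each row is one stored pattern.
memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])

def stochastic_read(query, memory, rng):
    # Addressing distribution p(a | query): softmax over similarities.
    scores = memory @ query
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    # Sample a discrete address, then retrieve that row; the marginal
    # read is therefore a conditional mixture over memory contents.
    address = rng.choice(len(memory), p=probs)
    return memory[address], probs

query = np.array([5.0, 0.0, 0.0])   # strongly matches row 0
content, probs = stochastic_read(query, memory, rng)
```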

Generative Temporal Models with Memory

no code implementations15 Feb 2017 Mevlana Gemici, Chia-Chun Hung, Adam Santoro, Greg Wayne, Shakir Mohamed, Danilo J. Rezende, David Amos, Timothy Lillicrap

We consider the general problem of modeling temporal data with long-range dependencies, wherein new observations are fully or partially predictable based on temporally-distant, past observations.

Variational Inference

Variational inference for Monte Carlo objectives

1 code implementation22 Feb 2016 Andriy Mnih, Danilo J. Rezende

Recent progress in deep latent variable models has largely been driven by the development of flexible and scalable variational inference methods.

Variational Inference
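
One family of such methods is the multi-sample Monte Carlo objective this paper studies: averaging K importance weights inside the log gives a lower bound on the log normalizer that tightens as K grows. A 1-D numpy sketch with a Gaussian target and proposal (toy values of my choosing; the paper's contribution is a low-variance gradient estimator for such objectives, not shown here):

```python
import numpy as np

rng = np.random.default_rng(5)

# Unnormalized target gamma(z) = exp(-z^2 / 2), so log Z = 0.5*log(2*pi).
log_gamma = lambda z: -0.5 * z**2
log_Z = 0.5 * np.log(2 * np.pi)

# Proposal q = N(0, 1.5^2).
s = 1.5
log_q = lambda z: -0.5 * (z / s) ** 2 - np.log(s) - 0.5 * np.log(2 * np.pi)

def multi_sample_bound(K, n_batches=20_000):
    z = rng.normal(0.0, s, size=(n_batches, K))
    log_w = log_gamma(z) - log_q(z)
    # Log of the average importance weight within each batch of K samples.
    return np.mean(np.log(np.mean(np.exp(log_w), axis=1)))

b1, b10 = multi_sample_bound(1), multi_sample_bound(10)
```

By Jensen's inequality both estimates sit below `log_Z`, and the K=10 bound is tighter than the K=1 (standard ELBO-style) bound.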

Semi-Supervised Learning with Deep Generative Models

18 code implementations NeurIPS 2014 Diederik P. Kingma, Danilo J. Rezende, Shakir Mohamed, Max Welling

The ever-increasing size of modern data sets, combined with the difficulty of obtaining label information, has made semi-supervised learning a problem of significant practical importance in modern data analysis.

Bayesian Inference

Variational Learning for Recurrent Spiking Networks

no code implementations NeurIPS 2011 Danilo J. Rezende, Daan Wierstra, Wulfram Gerstner

We derive a plausible learning rule updating the synaptic efficacies for feedforward, feedback and lateral connections between observed and latent neurons.

Variational Inference
