Search Results for author: Michael S. Albergo

Found 17 papers, 4 papers with code

SiT: Exploring Flow and Diffusion-based Generative Models with Scalable Interpolant Transformers

1 code implementation • 16 Jan 2024 • Nanye Ma, Mark Goldstein, Michael S. Albergo, Nicholas M. Boffi, Eric Vanden-Eijnden, Saining Xie

We present Scalable Interpolant Transformers (SiT), a family of generative models built on the backbone of Diffusion Transformers (DiT).

Image Generation

Learning to Sample Better

no code implementations • 17 Oct 2023 • Michael S. Albergo, Eric Vanden-Eijnden

These lecture notes provide an introduction to recent advances in generative modeling methods based on the dynamical transportation of measures, by means of which samples from a simple base measure are mapped to samples from a target measure of interest.

Stochastic interpolants with data-dependent couplings

no code implementations • 5 Oct 2023 • Michael S. Albergo, Mark Goldstein, Nicholas M. Boffi, Rajesh Ranganath, Eric Vanden-Eijnden

In this work, using the framework of stochastic interpolants, we formalize how to couple the base and the target densities, whereby samples from the base are computed conditionally given samples from the target, in a way that is different from (but does not preclude) incorporating information about class labels or continuous embeddings.

Super-Resolution
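
The gain from such a coupling can be seen in a toy sketch (assumed for illustration, not the paper's code): when the base sample is built from the target sample, e.g. as a noised copy in a super-resolution-style setup, the conditional velocity of the interpolant has far lower variance than under an independent base draw, which makes the regression target for the learned drift much easier to fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target samples and two choices of base sample (toy 1-D setup, assumed):
x1 = 3.0 + rng.standard_normal(10000)               # target draws
x0_indep = rng.standard_normal(10000)               # independent base draw
x0_coupled = x1 + 0.1 * rng.standard_normal(10000)  # base conditioned on target

# Linear interpolant x_t = (1 - t) x0 + t x1 has conditional velocity x1 - x0;
# compare its variance under the two couplings.
print(np.var(x1 - x0_indep), np.var(x1 - x0_coupled))
```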

Multimarginal generative modeling with stochastic interpolants

no code implementations • 5 Oct 2023 • Michael S. Albergo, Nicholas M. Boffi, Michael Lindsey, Eric Vanden-Eijnden

Given a set of $K$ probability densities, we consider the multimarginal generative modeling problem of learning a joint distribution that recovers these densities as marginals.

Fairness · Style Transfer

Stochastic Interpolants: A Unifying Framework for Flows and Diffusions

1 code implementation • 15 Mar 2023 • Michael S. Albergo, Nicholas M. Boffi, Eric Vanden-Eijnden

The time-dependent probability density function of the stochastic interpolant is shown to satisfy a first-order transport equation as well as a family of forward and backward Fokker-Planck equations with tunable diffusion coefficient.

Denoising
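
A sketch of the objects the abstract refers to (notation paraphrased for illustration; the time-dependent coefficients α, β, γ and drift b are assumptions of this sketch, not quoted from the paper):

```latex
% stochastic interpolant between x_0 ~ \rho_0 and x_1 ~ \rho_1, latent z ~ N(0, Id):
x_t = \alpha(t)\, x_0 + \beta(t)\, x_1 + \gamma(t)\, z,
\qquad \alpha(0) = \beta(1) = 1,\quad \alpha(1) = \beta(0) = 0,\quad \gamma(0) = \gamma(1) = 0
% its density \rho(t, x) satisfies a first-order transport equation
\partial_t \rho + \nabla \cdot \left( b\, \rho \right) = 0
% and, for any \epsilon > 0, an equivalent Fokker--Planck form obtained by
% adding and subtracting a score term \epsilon \nabla \log \rho:
\partial_t \rho + \nabla \cdot \big( \left( b + \epsilon\, \nabla \log \rho \right) \rho \big) = \epsilon\, \Delta \rho
```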

Aspects of scaling and scalability for flow-based sampling of lattice QCD

no code implementations • 14 Nov 2022 • Ryan Abbott, Michael S. Albergo, Aleksandar Botev, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Alexander G. D. G. Matthews, Sébastien Racanière, Ali Razavi, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban

Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.

Building Normalizing Flows with Stochastic Interpolants

1 code implementation • 30 Sep 2022 • Michael S. Albergo, Eric Vanden-Eijnden

A generative model based on a continuous-time normalizing flow between any pair of base and target probability densities is proposed.

Benchmarking · Density Estimation +1
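
A minimal 1-D sketch of the recipe, under assumptions not taken from the paper (base N(0, 1), target N(4, 1), linear interpolant, per-time-bin affine regression in place of a neural network): regress the conditional velocity x1 − x0 against the interpolant, then integrate the learned ODE to transport base samples to the target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: base rho_0 = N(0, 1), target rho_1 = N(4, 1), linear interpolant
# x_t = (1 - t) x0 + t x1, whose conditional velocity is simply x1 - x0.
K = 50                            # time-discretization bins
coef = np.zeros((K, 2))           # per-bin affine model b(t, x) ~ a + c x

for k in range(K):
    t = (k + 0.5) / K
    x0 = rng.standard_normal(20000)
    x1 = 4.0 + rng.standard_normal(20000)
    xt = (1 - t) * x0 + t * x1
    v = x1 - x0                   # regression target for the velocity field
    A = np.stack([np.ones_like(xt), xt], axis=1)
    coef[k] = np.linalg.lstsq(A, v, rcond=None)[0]

# Transport fresh base samples with the learned ODE dx/dt = b(t, x) (Euler).
x = rng.standard_normal(5000)
for k in range(K):
    a, c = coef[k]
    x = x + (1.0 / K) * (a + c * x)

print(x.mean(), x.std())          # should land near the target N(4, 1)
```

In the Gaussian case the optimal velocity is affine in x, so the per-bin least-squares fit is enough; in general a single network b(t, x) replaces the bin-wise regression.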

Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions

no code implementations • 18 Jul 2022 • Ryan Abbott, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Betsy Tian, Julian M. Urban

This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as stochastic estimators for the fermionic determinant.

Flow-based sampling in the lattice Schwinger model at criticality

no code implementations • 23 Feb 2022 • Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban

In this work, we provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.

Flow-based sampling for multimodal distributions in lattice field theory

no code implementations • 1 Jul 2021 • Daniel C. Hackett, Chung-Chun Hsieh, Michael S. Albergo, Denis Boyda, Jiunn-Wei Chen, Kai-Feng Chen, Kyle Cranmer, Gurtej Kanwar, Phiala E. Shanahan

Recent results have demonstrated that samplers constructed with flow-based generative models are a promising new approach for configuration generation in lattice field theory.

Flow-based sampling for fermionic lattice field theories

no code implementations • 10 Jun 2021 • Michael S. Albergo, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Julian M. Urban, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan

Algorithms based on normalizing flows are emerging as promising machine learning approaches to sampling complicated probability distributions in a way that can be made asymptotically exact.
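
The standard mechanism behind "asymptotically exact" here is an independence Metropolis step that corrects the flow's proposals. A toy sketch with stand-ins assumed for illustration (a fixed Gaussian proposal in place of a trained flow, and a quadratic "action" S(x) = (x − 1)²/2, so the Boltzmann density exp(−S) is N(1, 1)):

```python
import numpy as np

rng = np.random.default_rng(0)

def action(x):
    return 0.5 * (x - 1.0) ** 2   # toy action; exp(-S) is N(1, 1)

def log_q(x):
    return -0.5 * (x / 2.0) ** 2  # proposal q = N(0, 2^2), up to a constant

# Independence Metropolis: propose from q, accept with the exact ratio, so the
# chain targets exp(-S) exactly even though q is only an approximation.
x, samples = 0.0, []
for _ in range(50000):
    xp = 2.0 * rng.standard_normal()
    log_acc = (-action(xp) - log_q(xp)) - (-action(x) - log_q(x))
    if np.log(rng.random()) < log_acc:
        x = xp
    samples.append(x)

samples = np.asarray(samples[5000:])  # drop burn-in
print(samples.mean(), samples.std())  # should be close to 1 and 1
```

The better the proposal matches exp(−S), the higher the acceptance rate; the correctness of the stationary distribution does not depend on the quality of the flow.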

Introduction to Normalizing Flows for Lattice Field Theory

no code implementations • 20 Jan 2021 • Michael S. Albergo, Denis Boyda, Daniel C. Hackett, Gurtej Kanwar, Kyle Cranmer, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan

This notebook tutorial demonstrates a method for sampling Boltzmann distributions of lattice field theories using a class of machine learning models known as normalizing flows.

BIG-bench Machine Learning

Sampling using $SU(N)$ gauge equivariant flows

no code implementations • 12 Aug 2020 • Denis Boyda, Gurtej Kanwar, Sébastien Racanière, Danilo Jimenez Rezende, Michael S. Albergo, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan

We develop a flow-based sampling algorithm for $SU(N)$ lattice gauge theories that is gauge-invariant by construction.

Equivariant flow-based sampling for lattice gauge theory

no code implementations • 13 Mar 2020 • Gurtej Kanwar, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan

We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge-invariant by construction.
