Search Results for author: Emiel Hoogeboom

Found 23 papers, 13 papers with code

HexaConv

1 code implementation ICLR 2018 Emiel Hoogeboom, Jorn W. T. Peters, Taco S. Cohen, Max Welling

We find that, due to the reduced anisotropy of hexagonal filters, planar HexaConv provides better accuracy than planar convolution with square filters, given a fixed parameter budget.

Aerial Scene Classification Scene Classification

Emerging Convolutions for Generative Normalizing Flows

1 code implementation 30 Jan 2019 Emiel Hoogeboom, Rianne van den Berg, Max Welling

We generalize the 1 x 1 convolutions proposed in Glow to invertible d x d convolutions, which are more flexible since they operate on both channel and spatial axes.

Image Generation
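To make the abstract's claim concrete, here is a minimal sketch (not the paper's code) of the invertible 1 x 1 convolution from Glow that the paper generalizes: a single channel-mixing matrix W applied at every spatial position, invertible exactly when W is. All names and the 2-channel toy setup are illustrative assumptions.

```python
def conv1x1(x, W):
    # A 1x1 convolution mixes channels per spatial position with one matrix W;
    # it is invertible whenever W is, the property the paper extends from
    # Glow's 1x1 case to full d x d kernels acting on spatial axes too.
    return [(W[0][0] * c0 + W[0][1] * c1, W[1][0] * c0 + W[1][1] * c1)
            for c0, c1 in x]

def inv2x2(W):
    det = W[0][0] * W[1][1] - W[0][1] * W[1][0]
    return [[W[1][1] / det, -W[0][1] / det],
            [-W[1][0] / det, W[0][0] / det]]

W = [[2.0, 1.0], [1.0, 1.0]]           # invertible channel mixer, det = 1
x = [(0.5, -1.0), (2.0, 3.0)]          # two "pixels" with two channels each
y = conv1x1(x, W)
x_rec = conv1x1(y, inv2x2(W))          # applying W^{-1} undoes the layer
```

The log-determinant of the full transform is H·W·log|det W| (zero here, since det W = 1), which is what makes this layer cheap to use in a flow.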

Integer Discrete Flows and Lossless Compression

1 code implementation NeurIPS 2019 Emiel Hoogeboom, Jorn W. T. Peters, Rianne van den Berg, Max Welling

For that reason, we introduce a flow-based generative model for ordinal discrete data called Integer Discrete Flow (IDF): a bijective integer map that can learn rich transformations on high-dimensional data.
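A bijective integer map of the kind the abstract describes can be sketched with an additive coupling layer whose translation is rounded to an integer; the "network" `t` below is a hypothetical stand-in, not the paper's architecture.

```python
def t(x1):
    # Hypothetical stand-in for the coupling network: any map of x1 whose
    # output is rounded to an integer keeps the transform bijective.
    return round(0.5 * x1 + 1.3)

def idf_forward(x1, x2):
    # Integer additive coupling: y1 = x1, y2 = x2 + round(t(x1)).
    return x1, x2 + t(x1)

def idf_inverse(y1, y2):
    # Subtracting the same rounded translation inverts the map exactly.
    return y1, y2 - t(y1)

y1, y2 = idf_forward(7, -3)
x1, x2 = idf_inverse(y1, y2)           # exact round trip, no float error
```

Because everything stays on the integers, the inverse is exact, which is what makes such flows usable for lossless compression.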

Learning Likelihoods with Conditional Normalizing Flows

1 code implementation 29 Nov 2019 Christina Winkler, Daniel Worrall, Emiel Hoogeboom, Max Welling

Normalizing Flows (NFs) are able to model complicated distributions p(y) with strong inter-dimensional correlations and high multimodality by transforming a simple base density p(z) through an invertible neural network under the change of variables formula.

Structured Prediction Super-Resolution
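The change-of-variables formula the abstract refers to can be checked in one dimension with an affine flow z = (y - mu) / s; in the conditional setting of the paper, mu and s would be produced by a network from the conditioning input, but the formula is the same. This is a generic sketch, not the paper's model.

```python
import math

def log_standard_normal(z):
    return -0.5 * (z * z + math.log(2 * math.pi))

def flow_log_prob(y, mu, s):
    # Change of variables for the affine flow z = (y - mu) / s:
    # log p(y) = log p(z) + log |dz/dy| = log N(z; 0, 1) - log s.
    z = (y - mu) / s
    return log_standard_normal(z) - math.log(s)

# Sanity check against the closed-form N(mu, s^2) log-density.
y, mu, s = 1.7, 0.4, 2.0
closed_form = -0.5 * ((y - mu) / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))
```

The two expressions agree term by term, which is the whole content of the formula in this 1-D case.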

Predictive Sampling with Forecasting Autoregressive Models

no code implementations ICML 2020 Auke Wiggers, Emiel Hoogeboom

Autoregressive models (ARMs) currently hold state-of-the-art performance in likelihood-based modeling of image and audio data.

The Convolution Exponential and Generalized Sylvester Flows

1 code implementation NeurIPS 2020 Emiel Hoogeboom, Victor Garcia Satorras, Jakub M. Tomczak, Max Welling

Empirically, we show that the convolution exponential outperforms other linear transformations in generative flows on CIFAR10 and the graph convolution exponential improves the performance of graph normalizing flows.
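The key property behind the convolution exponential can be illustrated with a plain matrix exponential: for any linear map A (a convolution is one), exp(A) is always invertible with inverse exp(-A) and log-determinant trace(A). The truncated Taylor series below is a toy sketch on 2x2 matrices, not the paper's implicit implementation.

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(A, terms=30):
    # exp(A) = sum_k A^k / k!; for A representing a (graph) convolution this
    # yields an invertible layer with inverse exp(-A) and log-det = trace(A).
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = matmul(term, A)
        term = [[v / k for v in row] for row in term]
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

A = [[0.3, -0.2], [0.5, 0.1]]
negA = [[-v for v in row] for row in A]
I_approx = matmul(mat_exp(A), mat_exp(negA))   # should be close to identity
```

Since A commutes with -A, exp(A)·exp(-A) = I exactly; the truncation error is the only deviation.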

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

3 code implementations NeurIPS 2020 Didrik Nielsen, Priyank Jaini, Emiel Hoogeboom, Ole Winther, Max Welling

Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions.

Self Normalizing Flows

1 code implementation 14 Nov 2020 T. Anderson Keller, Jorn W. T. Peters, Priyank Jaini, Emiel Hoogeboom, Patrick Forré, Max Welling

Efficient gradient computation of the Jacobian determinant term is a core problem in many machine learning settings, and especially so in the normalizing flow framework.
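Why the Jacobian-determinant gradient is expensive can be seen from the identity d/dW log|det W| = W^{-T}: the exact gradient requires a matrix inverse. The 2x2 numerical check below illustrates that identity; it is a generic sketch, not the paper's self-normalization scheme.

```python
import math

def logabsdet2x2(W):
    return math.log(abs(W[0][0] * W[1][1] - W[0][1] * W[1][0]))

def inv_transpose2x2(W):
    det = W[0][0] * W[1][1] - W[0][1] * W[1][0]
    inv = [[W[1][1] / det, -W[0][1] / det],
           [-W[1][0] / det, W[0][0] / det]]
    return [[inv[j][i] for j in range(2)] for i in range(2)]

W = [[1.5, 0.3], [-0.2, 0.8]]
eps = 1e-6
grad_num = [[0.0, 0.0], [0.0, 0.0]]
for i in range(2):
    for j in range(2):
        Wp = [row[:] for row in W]; Wp[i][j] += eps
        Wm = [row[:] for row in W]; Wm[i][j] -= eps
        # Central finite difference of log|det W| in entry (i, j).
        grad_num[i][j] = (logabsdet2x2(Wp) - logabsdet2x2(Wm)) / (2 * eps)
# grad_num should match W^{-T}, the exact gradient of log|det W|.
```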

Argmax Flows: Learning Categorical Distributions with Normalizing Flows

no code implementations AABI Symposium 2021 Emiel Hoogeboom, Didrik Nielsen, Priyank Jaini, Patrick Forré, Max Welling

This paper introduces a new method to define and train continuous distributions such as normalizing flows directly on categorical data, for example text and image segmentation.

Image Segmentation Semantic Segmentation
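The bridge from continuous values to categories can be sketched with the argmax map the title refers to: the forward direction is a deterministic surjection, and sampling requires a stochastic lift that places probability only on continuous points whose argmax is the given category. The lift below is a deliberately simplified illustration, not the paper's thresholding or Gumbel constructions.

```python
import random

def argmax_forward(v):
    # Deterministic surjection from continuous R^K onto K categories.
    return max(range(len(v)), key=lambda k: v[k])

def argmax_lift(category, K, rng):
    # Simplified stochastic right-inverse (an assumption, not the paper's
    # method): sample noise, then force coordinate `category` to be the
    # strict maximum so the forward argmax recovers it.
    v = [rng.gauss(0.0, 1.0) for _ in range(K)]
    v[category] = max(v) + abs(rng.gauss(0.0, 1.0)) + 1e-3
    return v

rng = random.Random(0)
cats = [argmax_forward(argmax_lift(c, 5, rng)) for c in [0, 1, 2, 3, 4] * 10]
# Lifting then taking the argmax always returns the original category.
```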

Variational Determinant Estimation with Spherical Normalizing Flows

no code implementations AABI Symposium 2021 Simon Passenheim, Emiel Hoogeboom

This paper introduces the Variational Determinant Estimator (VDE), a variational extension of the determinant estimator recently proposed in arXiv:2005.06553v2.

Density Estimation Variational Inference

E(n) Equivariant Graph Neural Networks

5 code implementations 19 Feb 2021 Victor Garcia Satorras, Emiel Hoogeboom, Max Welling

This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations called E(n)-Equivariant Graph Neural Networks (EGNNs).

Representation Learning
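The invariance the abstract lists can be checked directly: EGNN builds its edge messages from squared distances ||x_i - x_j||^2, which are unchanged by rotations, translations, and reflections. The snippet below verifies this for an E(2) transformation of three points; it illustrates the invariant feature, not the model itself.

```python
import math

def sq_dist(a, b):
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

def transform(p, theta, t):
    # Rotation by theta followed by translation t: an E(2) transformation.
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])

coords = [(0.0, 0.0), (1.0, 2.0), (-1.5, 0.5)]
moved = [transform(p, 0.7, (3.0, -2.0)) for p in coords]

d_before = [sq_dist(coords[i], coords[j]) for i in range(3) for j in range(3)]
d_after = [sq_dist(moved[i], moved[j]) for i in range(3) for j in range(3)]
# EGNN messages consume these pairwise distances, so they are identical
# before and after the transformation (up to floating-point error).
```

Permutation equivariance comes separately, from the graph message-passing structure rather than from the distance features.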

E(n) Equivariant Normalizing Flows

1 code implementation NeurIPS 2021 Victor Garcia Satorras, Emiel Hoogeboom, Fabian B. Fuchs, Ingmar Posner, Max Welling

This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs).

Discrete Denoising Flows

no code implementations ICML Workshop INNF 2021 Alexandra Lindt, Emiel Hoogeboom

Discrete flow-based models are a recently proposed class of generative models that learn invertible transformations for discrete random variables.

Denoising

Autoregressive Diffusion Models

2 code implementations ICLR 2022 Emiel Hoogeboom, Alexey A. Gritsenko, Jasmijn Bastings, Ben Poole, Rianne van den Berg, Tim Salimans

We introduce Autoregressive Diffusion Models (ARDMs), a model class encompassing and generalizing order-agnostic autoregressive models (Uria et al., 2014) and absorbing discrete diffusion (Austin et al., 2021), which we show are special cases of ARDMs under mild assumptions.

Ranked #8 on Image Generation on CIFAR-10 (bits/dimension metric)

Image Generation
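Order-agnostic sampling, the mechanism ARDMs generalize, can be sketched in a few lines: draw a random order over the variables and reveal one per step, each prediction conditioned on what is already revealed. The predictor below is a hypothetical stand-in for the network, and the masking mirrors the absorbing-diffusion view mentioned in the abstract; this is an illustration, not the paper's model.

```python
import random

MASK = None

def dummy_predict(seq, pos):
    # Hypothetical stand-in for the network: here it just derives a token
    # from how many positions are revealed. A real ARDM would condition a
    # neural net on the revealed values and the target position.
    revealed = sum(v is not None for v in seq)
    return revealed % 2

def ardm_sample(length, rng):
    # Order-agnostic generation: pick a random permutation, then reveal one
    # masked (absorbed) variable per step, conditioning on the rest.
    seq = [MASK] * length
    order = list(range(length))
    rng.shuffle(order)
    for pos in order:
        seq[pos] = dummy_predict(seq, pos)
    return seq

rng = random.Random(42)
sample = ardm_sample(6, rng)
```

Training averages the per-step likelihood over random orders, which is why a single model handles every generation order.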

Equivariant Diffusion for Molecule Generation in 3D

3 code implementations 31 Mar 2022 Emiel Hoogeboom, Victor Garcia Satorras, Clément Vignac, Max Welling

This work introduces a diffusion model for molecule generation in 3D that is equivariant to Euclidean transformations.

Blurring Diffusion Models

no code implementations 12 Sep 2022 Emiel Hoogeboom, Tim Salimans

Recently, Rissanen et al. (2022) presented a new type of diffusion process for generative modeling based on heat dissipation, or blurring, as an alternative to isotropic Gaussian diffusion.

Denoising Inductive Bias
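Heat dissipation as a forward process can be illustrated with one discrete heat-equation step on a 1-D periodic signal: each step averages neighbors, so high-frequency content decays fastest. This is a generic toy sketch of blurring as diffusion, with an assumed step size, not the paper's frequency-space formulation.

```python
def blur_step(signal, alpha=0.2):
    # One explicit heat-equation step on a periodic 1-D signal:
    # x_i <- x_i + alpha * (x_{i-1} - 2 x_i + x_{i+1}).
    n = len(signal)
    return [signal[i] + alpha * (signal[(i - 1) % n] - 2 * signal[i]
                                 + signal[(i + 1) % n])
            for i in range(n)]

x = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]   # highest-frequency mode
for _ in range(10):
    x = blur_step(x)
spread = max(x) - min(x)                # decays toward 0 as heat dissipates
```

For this alternating signal each step multiplies the amplitude by 1 - 4*alpha = 0.2, so ten steps shrink it by 0.2^10; a constant signal would be left unchanged, which is the low-pass inductive bias the tag refers to.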

High-Fidelity Image Compression with Score-based Generative Models

no code implementations 26 May 2023 Emiel Hoogeboom, Eirikur Agustsson, Fabian Mentzer, Luca Versari, George Toderici, Lucas Theis

Despite the tremendous success of diffusion generative models in text-to-image generation, replicating this success in the domain of image compression has proven difficult.

Image Compression Text-to-Image Generation

DORSal: Diffusion for Object-centric Representations of Scenes et al.

no code implementations 13 Jun 2023 Allan Jabri, Sjoerd van Steenkiste, Emiel Hoogeboom, Mehdi S. M. Sajjadi, Thomas Kipf

In this paper, we leverage recent progress in diffusion models to equip 3D scene representation learning models with the ability to render high-fidelity novel views, while retaining benefits such as object-level scene editing to a large degree.

Neural Rendering Object

Rolling Diffusion Models

no code implementations 12 Feb 2024 David Ruhe, Jonathan Heek, Tim Salimans, Emiel Hoogeboom

Diffusion models have recently been increasingly applied to temporal data such as video, fluid mechanics simulations, or climate data.

Denoising Video Prediction

Multistep Consistency Models

no code implementations 11 Mar 2024 Jonathan Heek, Emiel Hoogeboom, Tim Salimans

By increasing the sample budget from a single step to 2-8 steps, we can more easily train models that generate higher-quality samples, while retaining much of the benefit in sampling speed.
