Search Results for author: Rianne van den Berg

Found 18 papers, 13 papers with code

Two for One: Diffusion Models and Force Fields for Coarse-Grained Molecular Dynamics

no code implementations 1 Feb 2023 Marloes Arts, Victor Garcia Satorras, Chin-wei Huang, Daniel Zuegner, Marco Federici, Cecilia Clementi, Frank Noé, Robert Pinsler, Rianne van den Berg

Coarse-grained (CG) molecular dynamics enables the study of biological processes at temporal and spatial scales that would be intractable at an atomistic resolution.

Protein Folding

Protein structure generation via folding diffusion

1 code implementation 30 Sep 2022 Kevin E. Wu, Kevin K. Yang, Rianne van den Berg, James Y. Zou, Alex X. Lu, Ava P. Amini

The ability to computationally generate novel yet physically foldable protein structures could lead to new biological discoveries and new treatments targeting yet incurable diseases.

Denoising • Protein Structure Prediction

Clifford Neural Layers for PDE Modeling

1 code implementation 8 Sep 2022 Johannes Brandstetter, Rianne van den Berg, Max Welling, Jayesh K. Gupta

We empirically evaluate the benefit of Clifford neural layers by replacing convolution and Fourier operations in common neural PDE surrogates by their Clifford counterparts on 2D Navier-Stokes and weather modeling tasks, as well as 3D Maxwell equations.

Weather Forecasting
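
The core algebraic ingredient of such layers is the geometric product, which replaces ordinary scalar multiplication inside convolution kernels. The numpy sketch below multiplies two multivectors in the 2-D Clifford algebra Cl(2,0) purely to illustrate that product; the function name and the [scalar, e1, e2, e1^e2] component layout are hypothetical, and this is not the paper's implementation.

```python
import numpy as np

def geometric_product_2d(a, b):
    """Geometric product in Cl(2,0) for multivectors laid out as
    [scalar, e1, e2, e1^e2] (hypothetical layout, for illustration only)."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0 * b0 + a1 * b1 + a2 * b2 - a3 * b3,  # scalar part
        a0 * b1 + a1 * b0 - a2 * b3 + a3 * b2,  # e1 part
        a0 * b2 + a2 * b0 + a1 * b3 - a3 * b1,  # e2 part
        a0 * b3 + a3 * b0 + a1 * b2 - a2 * b1,  # bivector (e1^e2) part
    ])

# A Clifford convolution would slide a kernel of such multivectors over a
# multivector-valued feature map, accumulating geometric products instead
# of scalar multiply-adds.
x = np.array([1.0, 2.0, 0.5, -1.0])
w = np.array([0.3, -0.2, 1.0, 0.7])
print(geometric_product_2d(x, w))
```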

Autoregressive Diffusion Models

2 code implementations ICLR 2022 Emiel Hoogeboom, Alexey A. Gritsenko, Jasmijn Bastings, Ben Poole, Rianne van den Berg, Tim Salimans

We introduce Autoregressive Diffusion Models (ARDMs), a model class encompassing and generalizing order-agnostic autoregressive models (Uria et al., 2014) and absorbing discrete diffusion (Austin et al., 2021), which we show are special cases of ARDMs under mild assumptions.

Ranked #8 on Image Generation on CIFAR-10 (bits/dimension metric)

Image Generation
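
As a rough illustration of the order-agnostic training that ARDMs build on (not the paper's exact objective or loss weighting), the sketch below draws a random generation order and a random step, keeps the already "generated" tokens as context, and marks the remaining positions to be predicted in parallel; the MASK id and function name are hypothetical.

```python
import numpy as np

MASK = -1  # hypothetical mask token id

def ardm_style_masking(tokens, rng):
    """Draw one order-agnostic training example: pick a random generation
    order and a random step t, keep the first t tokens of that order as
    context, and mask the remaining positions (all predicted in parallel)."""
    d = len(tokens)
    order = rng.permutation(d)      # random generation order (sigma)
    t = rng.integers(0, d + 1)      # how many positions are already "generated"
    observed = np.zeros(d, dtype=bool)
    observed[order[:t]] = True
    corrupted = np.where(observed, tokens, MASK)
    return corrupted, ~observed     # model predicts the tokens where ~observed

rng = np.random.default_rng(0)
tokens = np.array([5, 2, 9, 9, 1, 7])
corrupted, to_predict = ardm_style_masking(tokens, rng)
print(corrupted, to_predict)
```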

Gradual Domain Adaptation in the Wild: When Intermediate Distributions are Absent

no code implementations 29 Sep 2021 Samira Abnar, Rianne van den Berg, Golnaz Ghiasi, Mostafa Dehghani, Nal Kalchbrenner, Hanie Sedghi

It is shown that under the following two assumptions: (a) access to samples from intermediate distributions, and (b) samples being annotated with the amount of change from the source distribution, self-training can be successfully applied on gradually shifted samples to adapt the model toward the target distribution.

Domain Adaptation

Beyond In-Place Corruption: Insertion and Deletion In Denoising Probabilistic Models

no code implementations ICML Workshop INNF 2021 Daniel D. Johnson, Jacob Austin, Rianne van den Berg, Daniel Tarlow

Denoising diffusion probabilistic models (DDPMs) have shown impressive results on sequence generation by iteratively corrupting each example and then learning to map corrupted versions back to the original.

Denoising
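
The title refers to corruption processes that go beyond in-place token replacement. The toy sketch below corrupts a sequence with random deletions and insertions; it only illustrates the kind of corruption involved, with a hypothetical alphabet and probabilities, and is not the paper's actual forward process.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = list("ACGT")  # hypothetical toy alphabet

def corrupt_with_insert_delete(seq, p_del=0.1, p_ins=0.1, rng=rng):
    """Toy corruption step: each token may be dropped, and random tokens
    may be inserted between positions (illustrative only)."""
    out = []
    for tok in seq:
        if rng.random() >= p_del:       # keep the token with prob 1 - p_del
            out.append(tok)
        if rng.random() < p_ins:        # maybe insert a random token after it
            out.append(rng.choice(VOCAB))
    return "".join(out)

print(corrupt_with_insert_delete("ACGTACGTAA"))
```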

Structured Denoising Diffusion Models in Discrete State-Spaces

3 code implementations NeurIPS 2021 Jacob Austin, Daniel D. Johnson, Jonathan Ho, Daniel Tarlow, Rianne van den Berg

Here, we introduce Discrete Denoising Diffusion Probabilistic Models (D3PMs), diffusion-like generative models for discrete data that generalize the multinomial diffusion model of Hoogeboom et al. 2021, by going beyond corruption processes with uniform transition probabilities.

Denoising • Text Generation
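
A minimal sketch of the kind of structured forward process the snippet describes: each token is corrupted by a Markov transition matrix that is not uniform, here an absorbing-state ([MASK]-style) matrix. Vocabulary size, beta, and step count are illustrative values, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5            # toy vocabulary size; last index acts as an absorbing [MASK] state
beta = 0.1       # per-step corruption probability (illustrative value)

# Non-uniform transition matrix Q: with prob 1-beta stay, with prob beta
# jump to the absorbing state (one of the structured choices D3PMs allow,
# in contrast to uniform transitions).
Q = (1.0 - beta) * np.eye(K)
Q[:, -1] += beta
Q[-1, :] = 0.0
Q[-1, -1] = 1.0

def forward_step(x, Q, rng):
    """One forward corruption step: x_t[i] ~ Categorical(Q[x_{t-1}[i]])."""
    probs = Q[x]                                   # one row of Q per token
    return np.array([rng.choice(K, p=p) for p in probs])

x0 = rng.integers(0, K - 1, size=10)   # clean tokens (never the mask state)
xt = x0
for _ in range(5):
    xt = forward_step(xt, Q, rng)
print(x0, xt)
```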

Gradual Domain Adaptation in the Wild: When Intermediate Distributions are Absent

1 code implementation 10 Jun 2021 Samira Abnar, Rianne van den Berg, Golnaz Ghiasi, Mostafa Dehghani, Nal Kalchbrenner, Hanie Sedghi

It has been shown that under the following two assumptions: (a) access to samples from intermediate distributions, and (b) samples being annotated with the amount of change from the source distribution, self-training can be successfully applied on gradually shifted samples to adapt the model toward the target distribution.

Domain Adaptation
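
A toy numpy sketch of the gradual self-training loop described in the snippet, using a nearest-centroid classifier on synthetic 1-D Gaussians that drift between domains; the data, shifts, and classifier are stand-ins, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_domain(shift, n=200, rng=rng):
    """Two 1-D Gaussian classes whose means drift by `shift` (toy stand-in
    for a gradually shifting distribution)."""
    x0 = rng.normal(-1 + shift, 0.4, n)
    x1 = rng.normal(+1 + shift, 0.4, n)
    return np.concatenate([x0, x1]), np.concatenate([np.zeros(n), np.ones(n)])

def fit_centroids(x, y):
    return np.array([x[y == 0].mean(), x[y == 1].mean()])

def predict(x, centroids):
    return np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)

# Source domain: labelled. Intermediate domains: unlabelled, ordered by shift.
x_src, y_src = make_domain(0.0)
centroids = fit_centroids(x_src, y_src)

for shift in [0.5, 1.0, 1.5, 2.0]:            # gradually shifted samples
    x_dom, y_dom = make_domain(shift)         # y_dom used only for evaluation
    pseudo = predict(x_dom, centroids)        # self-training: pseudo-label
    centroids = fit_centroids(x_dom, pseudo)  # refit on pseudo-labelled data
    print(f"shift={shift:.1f}  acc={np.mean(pseudo == y_dom):.2f}")
```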

A Spectral Energy Distance for Parallel Speech Synthesis

2 code implementations NeurIPS 2020 Alexey A. Gritsenko, Tim Salimans, Rianne van den Berg, Jasper Snoek, Nal Kalchbrenner

Speech synthesis is an important practical generative modeling problem that has seen great progress over the last few years, with likelihood-based autoregressive neural models now outperforming traditional concatenative systems.

Speech Synthesis

IDF++: Analyzing and Improving Integer Discrete Flows for Lossless Compression

no code implementations ICLR 2021 Rianne van den Berg, Alexey A. Gritsenko, Mostafa Dehghani, Casper Kaae Sønderby, Tim Salimans

Furthermore, we zoom in on the effect of gradient bias due to the straight-through estimator in integer discrete flows, and demonstrate that its influence is highly dependent on architecture choices and less prominent than previously thought.

Quantization
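
The snippet refers to the gradient bias introduced by the straight-through estimator used to backpropagate through rounding. The sketch below writes that estimator as explicit forward/backward functions (outside any autodiff framework) purely to make the surrogate visible.

```python
import numpy as np

def ste_round_forward(x):
    # Forward pass: genuine rounding, as used inside integer discrete flows.
    return np.round(x)

def ste_round_backward(grad_out):
    # Straight-through backward pass: gradients are propagated as if the
    # rounding were the identity map. This surrogate is biased, which is
    # the effect the snippet above analyzes.
    return grad_out

x = np.array([0.2, 1.7, -0.6])
y = ste_round_forward(x)
g = ste_round_backward(np.ones_like(x))   # d(loss)/dx under the surrogate
print(y, g)
```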

Differentiable probabilistic models of scientific imaging with the Fourier slice theorem

1 code implementation 18 Jun 2019 Karen Ullrich, Rianne van den Berg, Marcus Brubaker, David Fleet, Max Welling

Finally, we demonstrate how the reconstruction algorithm can be extended with an amortized inference scheme on unknown attributes such as object pose.

3D Reconstruction • Computational Efficiency • +3
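
The method builds on the Fourier slice (projection-slice) theorem. As a quick sanity check of that identity in its 2-D, axis-aligned form, the snippet below verifies with numpy FFTs that the 1-D transform of a projection equals the central slice of the 2-D transform; the toy image is random.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 64))               # toy 2-D "density", indexed img[y, x]

# Fourier slice theorem in 2-D: the 1-D Fourier transform of a projection
# equals the slice of the 2-D Fourier transform through the origin,
# perpendicular to the projection direction.
projection = img.sum(axis=0)                 # project along y -> function of x
slice_from_2d_fft = np.fft.fft2(img)[0, :]   # ky = 0 row of the 2-D FFT

print(np.allclose(np.fft.fft(projection), slice_from_2d_fft))  # True
```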

Integer Discrete Flows and Lossless Compression

1 code implementation NeurIPS 2019 Emiel Hoogeboom, Jorn W. T. Peters, Rianne van den Berg, Max Welling

For that reason, we introduce a flow-based generative model for ordinal discrete data called Integer Discrete Flow (IDF): a bijective integer map that can learn rich transformations on high-dimensional data.
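
A minimal sketch of the kind of bijective integer map the snippet describes: an additive coupling layer that shifts half of the variables by a rounded function of the other half, so the transformation stays on the integers and is exactly invertible. The translate network here is a toy placeholder, not a learned model.

```python
import numpy as np

def idf_coupling_forward(z, translate):
    """Additive integer coupling: split z, shift one half by a rounded
    function of the other half. Stays on the integers and is invertible.
    `translate` is any (hypothetical) network mapping integers to reals."""
    za, zb = np.split(z, 2)
    return np.concatenate([za, zb + np.round(translate(za)).astype(z.dtype)])

def idf_coupling_inverse(y, translate):
    ya, yb = np.split(y, 2)
    return np.concatenate([ya, yb - np.round(translate(ya)).astype(y.dtype)])

translate = lambda a: 0.5 * a + 1.3      # toy stand-in for a learned network
z = np.array([3, -1, 4, 0], dtype=np.int64)
y = idf_coupling_forward(z, translate)
assert np.array_equal(idf_coupling_inverse(y, translate), z)
print(z, y)
```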

Emerging Convolutions for Generative Normalizing Flows

1 code implementation 30 Jan 2019 Emiel Hoogeboom, Rianne van den Berg, Max Welling

We generalize the 1 x 1 convolutions proposed in Glow to invertible d x d convolutions, which are more flexible since they operate on both channel and spatial axes.

Image Generation
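
For context, the sketch below implements the Glow-style invertible 1 x 1 convolution that this paper generalizes: a single c x c matrix mixes channels at every spatial location, contributing H * W * log|det W| to the log-likelihood. Shapes and names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def invertible_1x1_conv(x, W):
    """Glow-style invertible 1 x 1 convolution: apply the same c x c matrix
    W to the channel vector at every spatial location of x (H, W_spatial, C)."""
    return np.einsum('hwc,dc->hwd', x, W)

C, H, Wsp = 3, 8, 8
W = np.linalg.qr(rng.normal(size=(C, C)))[0]     # random orthogonal => invertible
x = rng.normal(size=(H, Wsp, C))
y = invertible_1x1_conv(x, W)
x_rec = invertible_1x1_conv(y, np.linalg.inv(W))
print(np.allclose(x_rec, x))                     # True: exactly invertible
# log|det Jacobian| = H * W_spatial * log|det W| (zero here since W is orthogonal)
print(H * Wsp * np.log(np.abs(np.linalg.det(W))))
```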

Predictive Uncertainty through Quantization

no code implementations 12 Oct 2018 Bastiaan S. Veeling, Rianne van den Berg, Max Welling

High-risk domains require reliable confidence estimates from predictive models.

Quantization

Sinkhorn AutoEncoders

2 code implementations ICLR 2019 Giorgio Patrini, Rianne van den Berg, Patrick Forré, Marcello Carioni, Samarth Bhargav, Max Welling, Tim Genewein, Frank Nielsen

We show that minimizing the p-Wasserstein distance between the generator and the true data distribution is equivalent to the unconstrained min-min optimization of the p-Wasserstein distance between the encoder aggregated posterior and the prior in latent space, plus a reconstruction error.

Probabilistic Programming
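
A compact numpy sketch of the Sinkhorn iterations behind such an approach: entropy-regularized optimal transport between samples standing in for the encoder's aggregated posterior and the prior. Sample shapes, epsilon, and iteration count are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sinkhorn(cost, eps=0.1, n_iters=200):
    """Entropy-regularized optimal transport between two uniform point
    clouds via Sinkhorn iterations; returns the transport plan and the
    resulting (regularized) transport cost. Illustrative only."""
    n, m = cost.shape
    K = np.exp(-cost / eps)
    u = np.ones(n) / n
    v = np.ones(m) / m
    for _ in range(n_iters):
        u = (1.0 / n) / (K @ v)
        v = (1.0 / m) / (K.T @ u)
    plan = u[:, None] * K * v[None, :]
    return plan, np.sum(plan * cost)

# Toy stand-ins for encoder aggregated-posterior samples and prior samples.
z_post = rng.normal(loc=0.5, scale=1.0, size=(128, 2))
z_prior = rng.normal(loc=0.0, scale=1.0, size=(128, 2))
cost = np.sum((z_post[:, None, :] - z_prior[None, :, :]) ** 2, axis=-1)  # squared L2
plan, w2_sq = sinkhorn(cost)
print(w2_sq)        # regularized approximation of W_2^2 between the two clouds
```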

Graph Convolutional Matrix Completion

15 code implementations 7 Jun 2017 Rianne van den Berg, Thomas N. Kipf, Max Welling

We consider matrix completion for recommender systems from the point of view of link prediction on graphs.

Ranked #4 on Recommendation Systems on YahooMusic Monti (using extra training data)

Collaborative Filtering • Link Prediction • +2
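
A rough sketch of the graph view described in the snippet: the rating matrix is treated as a bipartite user-item graph with one edge type per rating level, and a user embedding is formed by normalized message passing over each edge type. Features and weights are random placeholders, there is no training, and the link-prediction decoder is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

n_users, n_items, n_levels, d = 4, 5, 2, 8   # toy sizes; 2 rating levels
R = rng.integers(0, n_levels + 1, size=(n_users, n_items))  # 0 = unobserved

# One edge type per rating level; aggregate item features separately per
# level (a rough sketch of the graph-convolutional encoder idea only).
x_items = rng.normal(size=(n_items, d))
W = rng.normal(size=(n_levels, d, d)) / np.sqrt(d)

user_emb = np.zeros((n_users, d))
for r in range(1, n_levels + 1):
    A = (R == r).astype(float)                    # users x items adjacency, level r
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    user_emb += (A / deg) @ x_items @ W[r - 1]    # normalized message passing

print(user_emb.shape)   # (n_users, d): one embedding per user
```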
