Improved Denoising Diffusion Probabilistic Models

18 Feb 2021 · Alex Nichol, Prafulla Dhariwal

Denoising diffusion probabilistic models (DDPMs) are a class of generative models that have recently been shown to produce excellent samples. We show that, with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality. Additionally, we find that learning the variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes at a negligible cost in sample quality, which is important for the practical deployment of these models. We additionally use precision and recall to compare how well DDPMs and GANs cover the target distribution. Finally, we show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable. We release our code at https://github.com/openai/improved-diffusion.
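The "learning variances" idea mentioned above can be sketched briefly. The paper parameterizes the reverse-process variance as a log-space interpolation between the two analytic bounds β_t (upper) and β̃_t (lower), with a per-dimension network output v. The helper below is a minimal NumPy sketch of that interpolation; the linear β schedule, the timestep, and the value of `v` are illustrative assumptions, not values from the paper's experiments.

```python
import numpy as np

def learned_variance(v, beta_t, beta_tilde_t):
    """Interpolate between the two variance bounds in log space:
    Sigma = exp(v * log(beta_t) + (1 - v) * log(beta_tilde_t)),
    where v in [0, 1] would be predicted by the network."""
    return np.exp(v * np.log(beta_t) + (1.0 - v) * np.log(beta_tilde_t))

# Illustrative linear beta schedule over T = 1000 diffusion steps.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_cumprod = np.cumprod(1.0 - betas)
alphas_cumprod_prev = np.append(1.0, alphas_cumprod[:-1])

# beta_tilde_t: the lower bound on the reverse-process variance.
betas_tilde = betas * (1.0 - alphas_cumprod_prev) / (1.0 - alphas_cumprod)

t = 500
v = 0.3  # hypothetical (sigmoid-squashed) network output
sigma = learned_variance(v, betas[t], betas_tilde[t])
```

Because the output always lies between β̃_t and β_t, the model cannot predict a degenerate variance, which is what makes the parameterization stable enough to optimize the variational bound directly.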


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Image Generation | CIFAR-10 | Improved DDPM (DINOv2) | FD | 212.3 | #5 |
| Image Generation | CIFAR-10 | Improved DDPM | FID | 3.27 | #49 |
| Image Generation | CIFAR-10 | Improved DDPM | bits/dimension | 2.94 | #25 |
| Image Generation | ImageNet 256x256 | Improved DDPM | FID | 12.3 | #42 |
| Image Generation | ImageNet 64x64 | Improved DDPM | bits/dimension | 3.53 | #12 |
| Image Generation | ImageNet 64x64 | Improved DDPM | FID | 2.92 | #8 |
