Optimizing Few-Step Diffusion Samplers by Gradient Descent

Denoising Diffusion Probabilistic Models (DDPMs) have emerged as a flexible family of generative models rivaling GANs and autoregressive models in sample quality and likelihood. DDPMs, however, typically require hundreds of inference steps to generate a high-fidelity image, despite recent progress on speeding up diffusion model sampling. We introduce Differentiable Diffusion Sampler Search (DDSS): a method that learns few-step samplers for any pre-trained DDPM via gradient descent. We propose Generalized Gaussian Diffusion Processes (GGDP), a family of non-Markovian samplers for diffusion models, and show how to improve the samples generated by pre-trained DDPMs by optimizing the degrees of freedom of the GGDP sampler family with respect to a perceptual loss. Our optimization procedure backpropagates through the sampling process using the reparameterization trick. Searching our novel GGDP family with DDSS, we achieve strong results on unconditional image generation on both CIFAR-10 and ImageNet 64x64 (e.g., an FID of 7.59 on CIFAR-10 with only 10 inference steps, and 4.67 with 25 steps, compared to 13.62 and 6.56 for the strongest respective DDIM($\eta=0$) baselines). Our method is compatible with any pre-trained DDPM without re-training, only needs to be applied once, and does not fine-tune the parameters of the pre-trained DDPM.
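The sketch below illustrates the core idea of optimizing a few-step sampler by backpropagating through sampling with the reparameterization trick. It is a minimal, illustrative example, not the paper's implementation: `eps_model` is a toy stand-in for the frozen pre-trained DDPM, `perceptual_loss` is a dummy placeholder for the sample-quality objective, and the per-step coefficients `a`, `b`, `s` are an assumed parameterization rather than the actual GGDP family.

```python
import torch
import torch.nn as nn

# Stand-ins (assumptions, not the paper's models): a frozen "noise predictor"
# and a differentiable sample-quality loss.
eps_model = nn.Conv2d(3, 3, kernel_size=3, padding=1)  # placeholder for the pre-trained DDPM
for p in eps_model.parameters():
    p.requires_grad_(False)                             # DDPM weights stay frozen

def perceptual_loss(x0):
    # Placeholder for a differentiable perceptual / sample-quality score.
    return x0.pow(2).mean()

T = 10                  # few-step sampler budget
shape = (3, 32, 32)

# Learnable degrees of freedom of the Gaussian sampler (illustrative
# parameterization): per-step scales on the iterate, the predicted noise,
# and the injected noise.
a = nn.Parameter(torch.linspace(0.99, 0.90, T))
b = nn.Parameter(torch.full((T,), 0.10))
s = nn.Parameter(torch.full((T,), 0.05))
opt = torch.optim.Adam([a, b, s], lr=1e-3)

def sample(batch_size):
    """Differentiable few-step sampler: fresh Gaussian noise enters each step
    only via the reparameterization trick, so gradients of the loss flow back
    into the sampler coefficients (a, b, s)."""
    x = torch.randn(batch_size, *shape)
    for t in reversed(range(T)):
        eps = eps_model(x)                      # frozen model (a real DDPM also conditions on t)
        z = torch.randn_like(x)                 # reparameterized injected noise
        x = a[t] * x - b[t] * eps + s[t] * z    # learnable Gaussian update
    return x

for step in range(100):
    loss = perceptual_loss(sample(batch_size=8))
    opt.zero_grad()
    loss.backward()                             # backprop through all T sampler steps
    opt.step()
```

Because all stochasticity enters through the externally sampled noise `z`, the entire T-step chain remains differentiable with respect to the sampler's degrees of freedom while the pre-trained DDPM parameters are never updated.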
