Contractive Diffusion Probabilistic Models

23 Jan 2024 · Wenpin Tang, Hanyang Zhao

Diffusion probabilistic models (DPMs) have emerged as a promising technique in generative modeling. The success of DPMs relies on two ingredients: time reversal of diffusion processes and score matching. Most existing works implicitly assume that score matching is close to perfect, yet this assumption is questionable. Since score matching cannot be guaranteed, we propose a new design criterion -- contraction of the backward sampling process -- leading to a novel class of contractive DPMs (CDPMs). The key insight is that contraction in the backward process narrows both score matching errors and discretization errors, so the proposed CDPMs are robust to both sources of error. For practical use, we show that a CDPM can leverage a pretrained DPM via a simple transformation, without retraining. We corroborate our approach with experiments on synthetic one-dimensional examples, the Swiss Roll, MNIST, CIFAR-10 (32$\times$32), and AFHQ (64$\times$64) datasets. Notably, CDPM shows the best performance among all known SDE-based DPMs.
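To make the backward-sampling setting concrete, the following is a minimal sketch of an Euler-Maruyama discretization of the reverse-time VP-SDE that SDE-based DPMs sample from. It is a generic illustration, not the paper's contractive transformation: `score_fn` is a hypothetical stand-in for a pretrained score network, and `beta` is an assumed linear noise schedule. In this setting, the score matching error (an inaccurate `score_fn`) and the discretization error (finite `n_steps`) are exactly the two error sources that a contractive backward process is designed to dampen.

```python
import numpy as np

def reverse_vp_sde_sample(score_fn, x_T, n_steps=500, T=1.0,
                          beta=lambda t: 0.1 + 19.9 * t, rng=None):
    """Euler-Maruyama discretization of the reverse-time VP-SDE

        dx = [-(1/2) beta(t) x - beta(t) * score(x, t)] dt + sqrt(beta(t)) dW,

    integrated backward from t = T down to t = 0.

    score_fn : stand-in for a pretrained score network s(x, t).
    x_T      : terminal sample(s), typically drawn from N(0, I).
    beta     : assumed linear noise schedule (hypothetical choice).
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x_T, dtype=float)
    dt = T / n_steps
    for i in range(n_steps):
        t = T - i * dt
        b = beta(t)
        # Reverse-time drift: forward drift minus the score correction.
        drift = -0.5 * b * x - b * score_fn(x, t)
        # No noise on the final step (standard last-step convention).
        z = rng.standard_normal(x.shape) if i < n_steps - 1 else 0.0
        x = x - drift * dt + np.sqrt(b * dt) * z
    return x
```

As a sanity check, if the data distribution is standard normal, its score is $s(x,t) = -x$ for all $t$ under the VP-SDE, and sampling with `score_fn=lambda x, t: -x` should return approximately standard-normal samples.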
