Multi-Resolution Continuous Normalizing Flows

15 Jun 2021  ·  Vikram Voleti, Chris Finlay, Adam Oberman, Christopher Pal ·

Recent work has shown that Neural Ordinary Differential Equations (ODEs) can serve as generative models of images from the perspective of Continuous Normalizing Flows (CNFs). Such models offer exact likelihood computation and invertible generation/density estimation. In this work we introduce a Multi-Resolution variant of such models (MRCNF) by characterizing the conditional distribution over the additional information required to generate a fine image that is consistent with the coarse image. We introduce a transformation between resolutions that leaves the log-likelihood unchanged. We show that this approach yields comparable likelihood values on various image datasets, with improved performance at higher resolutions and fewer parameters, using only one GPU. Further, we examine the out-of-distribution properties of (Multi-Resolution) Continuous Normalizing Flows, and find that they are similar to those of other likelihood-based generative models.
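The exact likelihood a CNF computes comes from the instantaneous change-of-variables formula, d log p(z(t))/dt = -tr(∂F/∂z). A minimal sketch, assuming toy linear dynamics F(z) = Az (hypothetical, chosen so the Jacobian trace is constant) and plain Euler integration rather than the adaptive ODE solvers such models typically use:

```python
import numpy as np

def cnf_euler(x, A, t1=1.0, steps=1000):
    """Integrate z' = A z from data x (t=0) to the base space (t=t1),
    accumulating the trace term of the instantaneous change of variables."""
    z = x.astype(float).copy()
    logp_corr = 0.0
    dt = t1 / steps
    tr = np.trace(A)  # Jacobian of F(z) = A z is A, so its trace is constant
    for _ in range(steps):
        z = z + dt * (A @ z)
        logp_corr += dt * tr  # log p(x) = log p(z(t1)) + integral of tr(dF/dz) dt
    return z, logp_corr

def log_prob(x, A):
    """Exact log-likelihood of x under the flow, with a standard normal base."""
    z1, logp_corr = cnf_euler(x, A)
    d = x.shape[0]
    log_base = -0.5 * (d * np.log(2 * np.pi) + z1 @ z1)
    return log_base + logp_corr
```

For these linear dynamics the flow maps x to exp(A)x, so the returned value can be checked against the closed-form Gaussian density it induces; a real CNF replaces A z with a neural network and estimates the trace stochastically.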

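A transformation between resolutions that leaves the log-likelihood unchanged can be built from any orthonormal map, since |det J| = 1 contributes zero to the log-density. A hedged sketch using a 2x2 Haar-style transform (an assumption for illustration; the paper's exact transform may differ), splitting a fine image into a coarse half-resolution image plus three detail maps:

```python
import numpy as np

def haar_split(img):
    """Map an (H, W) image to a coarse (H/2, W/2) image + 3 detail maps.
    The underlying 4x4 mixing matrix is orthonormal, so log|det J| = 0."""
    a = img[0::2, 0::2]
    b = img[0::2, 1::2]
    c = img[1::2, 0::2]
    d = img[1::2, 1::2]
    coarse = (a + b + c + d) / 2.0  # orthonormal scaling is 1/2, not 1/4
    dh = (a - b + c - d) / 2.0
    dv = (a + b - c - d) / 2.0
    dd = (a - b - c + d) / 2.0
    return coarse, (dh, dv, dd)

def haar_merge(coarse, details):
    """Exact inverse of haar_split: the transpose of the orthonormal map."""
    dh, dv, dd = details
    a = (coarse + dh + dv + dd) / 2.0
    b = (coarse - dh + dv - dd) / 2.0
    c = (coarse + dh - dv - dd) / 2.0
    d = (coarse - dh - dv + dd) / 2.0
    out = np.empty((coarse.shape[0] * 2, coarse.shape[1] * 2))
    out[0::2, 0::2] = a
    out[0::2, 1::2] = b
    out[1::2, 0::2] = c
    out[1::2, 1::2] = d
    return out
```

Because the map is invertible and volume-preserving, the log-likelihood of a fine image decomposes exactly into the coarse image's likelihood plus the conditional likelihood of the detail maps, which is the decomposition the multi-resolution model exploits.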

Results from the Paper


Ranked #6 on Image Generation on ImageNet 64x64 (Bits per dim metric)

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Density Estimation | CIFAR-10 | MRCNF | NLL (bits/dim) | 3.54 | #10 |
| Image Generation | CIFAR-10 | MRCNF | bits/dim | 3.54 | #63 |
| Image Generation | ImageNet 32x32 | MRCNF | bits/dim | 3.77 | #7 |
| Image Generation | ImageNet 64x64 | MRCNF | bits/dim | 3.44 | #6 |
