Learning Generative Models using Denoising Density Estimators

8 Jan 2020 · Siavash A. Bigdeli, Geng Lin, Tiziano Portenier, L. Andrea Dunbar, Matthias Zwicker

Learning probabilistic models that can estimate the density of a given set of samples, and generate new samples from that density, is one of the fundamental challenges in unsupervised machine learning. We introduce a new generative model based on denoising density estimators (DDEs), scalar functions parameterized by neural networks that are efficiently trained to represent kernel density estimates of the data. Leveraging DDEs, our main contribution is a novel technique for obtaining generative models by minimizing the KL divergence directly. We prove that our algorithm for obtaining generative models is guaranteed to converge to the correct solution. Our approach neither requires a specific network architecture, as in normalizing flows, nor the use of ordinary differential equation solvers, as in continuous normalizing flows. Experimental results demonstrate substantial improvement in density estimation and competitive performance in generative model training.
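The core idea behind a DDE can be illustrated with denoising score matching: a function is trained so that its gradient matches the denoising target, at which point it represents the log of the Gaussian-smoothed data density (up to a constant). The sketch below is a minimal toy version, assuming a hand-parameterized 1-D Gaussian log-density in place of the paper's neural network; the parameter names (`mu`, `v`) and the data distribution are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 1-D samples from N(2, 1). A DDE-style estimator fits a scalar
# function f whose gradient (the score) matches the denoising target
# (x - x_noisy) / sigma^2; the optimum is the log of the data density
# smoothed by a Gaussian kernel of width sigma.
x = rng.normal(2.0, 1.0, size=5000)
sigma = 0.5  # smoothing-kernel noise level (assumed for this toy example)

# Model (an assumption, standing in for a neural net):
#   f(x) = -(x - mu)^2 / (2 v)   =>   score f'(x) = -(x - mu) / v
mu, v = 0.0, 1.0
lr = 0.05

for _ in range(2000):
    eps = rng.normal(0.0, sigma, size=x.shape)
    x_noisy = x + eps
    target = -eps / sigma**2          # denoising score-matching target
    score = -(x_noisy - mu) / v       # model score at the noisy points
    r = score - target                # residual to be minimized in MSE
    # Gradients of mean(r^2) with respect to mu and v
    g_mu = np.mean(2 * r / v)
    g_v = np.mean(2 * r * (x_noisy - mu) / v**2)
    mu -= lr * g_mu
    v -= lr * g_v

# The optimum is the smoothed density: mu -> 2, v -> 1 + sigma^2 = 1.25
print(mu, v)
```

Because the fitted density is the data density convolved with the noise kernel, the recovered variance is inflated by sigma^2; the paper's contribution builds on such estimators to drive a generator by direct KL minimization.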


Results from the Paper


Task                Dataset        Model  Metric          Value   Global Rank
Density Estimation  UCI GAS        DDE    Log-likelihood   9.73   #2
Density Estimation  UCI HEPMASS    DDE    Log-likelihood  -11.3   #1
Density Estimation  UCI MINIBOONE  DDE    Log-likelihood  -6.94   #1
Density Estimation  UCI MINIBOONE  DDE    NLL              6.94   #1
Density Estimation  UCI POWER      DDE    Log-likelihood   0.97   #2
