Constructive universal distribution generation through deep ReLU networks

We present an explicit deep network construction that transforms uniformly distributed one-dimensional noise into an arbitrarily close approximation of any two-dimensional target distribution with finite differential entropy and Lipschitz-continuous pdf. The key ingredient of our design is a generalization of the "space-filling" property of sawtooth functions introduced by Bailey & Telgarsky (2018). We elucidate the importance of depth in our construction, which drives the Wasserstein distance between the target distribution and the approximation realized by the proposed neural network to zero. Finally, we outline how our construction can be extended to output distributions of arbitrary dimension.
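The sawtooth functions referenced in the abstract can be realized exactly by narrow ReLU networks, and composing them yields exponentially many oscillations in the depth. The sketch below is illustrative only (it shows the classical sawtooth construction, not the paper's generalized space-filling transform); the function names are our own.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sawtooth(x):
    # One tent/sawtooth map on [0, 1], written as a single ReLU layer:
    #   g(x) = 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1)
    # g maps [0, 1/2] linearly up to 1 and [1/2, 1] linearly back to 0.
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth_composed(x, depth):
    # Composing g with itself `depth` times produces a sawtooth with
    # 2**(depth - 1) teeth, i.e. oscillation count exponential in depth.
    for _ in range(depth):
        x = sawtooth(x)
    return x
```

For example, `sawtooth(0.25)` returns 0.5, and a depth-3 composition already has four teeth on [0, 1]. It is this rapid oscillation that lets a deep network "fold" a one-dimensional input densely enough to cover a higher-dimensional target space.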

ICML 2020