# Approximation for Probability Distributions by Wasserstein GAN

18 Mar 2021

In this paper, we study Wasserstein Generative Adversarial Networks (WGANs) that use GroupSort neural networks as discriminators. We show that the approximation error bound for the target distribution depends on the width/depth (capacity) of both the generator and the discriminator, as well as on the number of training samples. We establish a quantified generalization bound for the Wasserstein distance between the generated distribution and the target distribution. According to our theoretical results, WGAN places higher capacity requirements on the discriminator than on the generator, which is consistent with some existing theories. More importantly, overly deep and wide (high-capacity) generators may yield worse results than low-capacity generators if the discriminator is not strong enough. Moreover, we develop a further generalization error bound that is free from the curse of dimensionality with respect to the number of training samples: it improves from $\widetilde{\mathcal{O}}(m^{-1/r} + n^{-1/d})$ to $\widetilde{\mathcal{O}}(\text{Pdim}(\mathcal{F}\circ \mathcal{G}) \cdot m^{-1/2} + \text{Pdim}(\mathcal{F}) \cdot n^{-1/2})$. However, the latter seems to contradict numerical observations. Compared with existing results, ours apply to general GroupSort WGANs without special architectures. Finally, numerical results on the Swiss roll and MNIST data sets confirm our theoretical findings.
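The GroupSort activation mentioned above partitions a layer's units into groups of a fixed size and sorts the values within each group, which preserves gradient norms and makes Lipschitz constraints easier to enforce in the discriminator. The following is a minimal NumPy sketch of the activation only, not the authors' implementation; the function name `groupsort` and the `group_size` parameter are illustrative assumptions.

```python
import numpy as np

def groupsort(x, group_size=2):
    """GroupSort activation (illustrative sketch).

    Splits the last axis of `x` into consecutive groups of `group_size`
    entries and sorts each group in ascending order. The feature
    dimension must be divisible by `group_size`. With group_size=2 this
    is the "MaxMin" special case.
    """
    *batch, n = x.shape
    assert n % group_size == 0, "features must divide evenly into groups"
    y = x.reshape(*batch, n // group_size, group_size)
    y = np.sort(y, axis=-1)          # sort within each group
    return y.reshape(*batch, n)      # flatten back to original shape

# Example: with groups of size 2, each adjacent pair is sorted.
v = np.array([3.0, 1.0, -2.0, 5.0])
print(groupsort(v))  # [ 1.  3. -2.  5.]
```

Because sorting is a permutation of its inputs, the activation is norm-preserving, which is why GroupSort networks are a natural choice for the 1-Lipschitz discriminator class in WGAN.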


## Code

No code implementations yet.