Relaxed Wasserstein with Applications to GANs

19 May 2017 · Xin Guo, Johnny Hong, Tianyi Lin, Nan Yang

Wasserstein Generative Adversarial Networks (WGANs) provide a versatile class of models that has attracted great attention in a variety of applications. However, the framework has two main drawbacks: (i) the Wasserstein-1 (or Earth-Mover) distance is restrictive, so WGANs cannot always fit the geometry of the data well; (ii) WGANs are difficult to train quickly. In this paper, we propose a new class of Relaxed Wasserstein (RW) distances by generalizing the Wasserstein-1 distance with Bregman cost functions. We show that RW distances enjoy favorable statistical properties without sacrificing computational tractability. Combined with the GAN framework, we develop Relaxed WGANs (RWGANs), which are not only statistically flexible but can also be approximated efficiently using heuristic approaches. Experiments on real images demonstrate that the RWGAN with the Kullback-Leibler (KL) cost function outperforms competing approaches such as WGANs, even when those are trained with a gradient penalty.
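
As a minimal sketch of the construction described in the abstract (the precise definition is in the paper; the formulas below assume the RW distance is the optimal-transport cost with the metric replaced by a Bregman divergence generated by a strictly convex, differentiable function \phi):

% Bregman divergence generated by \phi
\[
  D_{\phi}(x, y) \;=\; \phi(x) - \phi(y) - \langle \nabla \phi(y),\, x - y \rangle
\]
% Relaxed Wasserstein discrepancy between distributions \mu and \nu,
% taking the infimum over couplings \Pi(\mu, \nu), in analogy with Wasserstein-1
\[
  \mathrm{RW}_{\phi}(\mu, \nu) \;=\; \inf_{\pi \in \Pi(\mu, \nu)} \int D_{\phi}(x, y)\, \mathrm{d}\pi(x, y)
\]

Under this reading, the KL cost used in the experiments corresponds to the negative-entropy generator \phi(x) = \sum_i x_i \log x_i, whose Bregman divergence is the (generalized) Kullback-Leibler divergence.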
