Learning Disconnected Manifolds: Avoiding The No GAN's Land by Latent Rejection

1 Jan 2021  ·  Thibaut Issenhuth, Ugo Tanielian, David Picard, Jeremie Mary

Standard formulations of GANs, in which a continuous function deforms a connected latent space, have been shown to be misspecified when fitting disconnected manifolds. In particular, when covering different classes of images, the generator will necessarily sample some low-quality images in between the modes. Rather than modifying the learning procedure, a line of work aims at improving the sampling quality of trained generators. Thus, it is now common to introduce a rejection step within the generation procedure. Building on this, we propose to train an additional network and transform the latent space via adversarial learning of importance weights. This idea has several advantages: 1) it provides a way to inject disconnectedness into any GAN architecture, 2) performing the rejection in the latent space avoids passing samples through both the generator and the discriminator, saving computation time, 3) the importance-weight formulation provides a principled way to estimate the Wasserstein distance to the true distribution, enabling its minimization. We demonstrate the effectiveness of our method on different datasets, both synthetic and high-dimensional, and stress its superiority on highly disconnected data.
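To make the sampling procedure concrete, here is a minimal sketch of rejection sampling performed directly in the latent space with learned importance weights. It is an illustration under stated assumptions, not the authors' implementation: the names `latent_dim`, `weight_net`, `generator`, and the bound `w_max` are hypothetical placeholders, and the adversarial training of the weight network is omitted.

```python
# Minimal sketch: latent rejection sampling with learned importance weights.
# Assumes a pretrained generator and an already-trained auxiliary network
# weight_net predicting a non-negative importance weight w(z) per latent code.
import torch
import torch.nn as nn

latent_dim = 128  # hypothetical latent dimension

# Small auxiliary network producing w(z) >= 0 for each latent code z.
# (In the paper this network is trained adversarially; only its role at
# sampling time is shown here.)
weight_net = nn.Sequential(
    nn.Linear(latent_dim, 256),
    nn.ReLU(),
    nn.Linear(256, 1),
    nn.Softplus(),  # keeps the predicted weights non-negative
)

def rejection_sample_latents(n, w_max, batch_size=512):
    """Draw n latent codes, accepting each z with probability w(z) / w_max.

    w_max must upper-bound the importance weights; in practice it can be
    estimated from a large batch of weight predictions.
    """
    accepted = []
    num_accepted = 0
    while num_accepted < n:
        z = torch.randn(batch_size, latent_dim)
        with torch.no_grad():
            w = weight_net(z).squeeze(-1)          # shape: (batch_size,)
        keep = torch.rand(batch_size) * w_max < w   # accept with prob w(z)/w_max
        accepted.append(z[keep])
        num_accepted += int(keep.sum())
    return torch.cat(accepted)[:n]

# Only the accepted latents are pushed through the (frozen) generator,
# so rejected samples never incur a generator or discriminator pass:
# images = generator(rejection_sample_latents(64, w_max=10.0))
```

Because the accept/reject decision only evaluates the small weight network, the cost of a rejected sample is a single cheap forward pass, which is the computational advantage claimed in point 2) above.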
