
Conditioning Trick for Training Stable GANs

In this paper we propose a conditioning trick, called difference departure from normality, applied to the generator network to address instability during GAN training. We force the generator to move closer to the departure-from-normality function of real samples, computed in the spectral domain of the Schur decomposition...
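The abstract does not spell out the loss, but the quantity it names is standard: for a Schur decomposition A = Q T Q*, the Frobenius norm of the strictly upper-triangular part of T (Henrici's departure from normality) is zero exactly when A is normal. A minimal sketch, assuming that definition and using hypothetical helper names (`departure_from_normality`, `conditioning_penalty` are illustrative, not the authors' API):

```python
import numpy as np
from scipy.linalg import schur

def departure_from_normality(A: np.ndarray) -> float:
    """Henrici's departure from normality via the Schur form A = Q T Q*.
    T is upper triangular with the eigenvalues of A on its diagonal; the
    Frobenius norm of its strictly upper-triangular part measures how far
    A is from being a normal matrix (0 iff A is normal)."""
    T, _ = schur(A, output="complex")   # complex Schur form: A = Q T Q^H
    N = np.triu(T, k=1)                 # strictly upper-triangular part of T
    return float(np.linalg.norm(N, "fro"))

def conditioning_penalty(A_fake: np.ndarray, A_real: np.ndarray) -> float:
    """Hypothetical penalty: absolute difference between the
    departure-from-normality of a matrix derived from generated samples
    and one derived from real samples, per the 'difference departure
    from normality' idea described in the abstract."""
    return abs(departure_from_normality(A_fake)
               - departure_from_normality(A_real))
```

How the matrices `A_fake` and `A_real` are built from sample batches, and how the penalty is weighted against the adversarial loss, is not specified in the text above.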
