GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium

Generative Adversarial Networks (GANs) excel at creating realistic images with complex models for which maximum likelihood is infeasible. However, the convergence of GAN training has still not been proved.
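The two time-scale update rule named in the title amounts to giving the discriminator and the generator separate learning rates. A minimal sketch on a toy saddle-point game, where the objective f(d, g) = 2*d*g - d**2 and the learning-rate values are illustrative assumptions rather than the paper's experimental setup:

```python
# Hedged TTUR sketch: two players, two step sizes. The discriminator
# ascends f on a fast time scale, the generator descends f on a slow one.
# Toy objective and learning rates are assumptions for illustration only.

def ttur_toy(lr_d=0.1, lr_g=0.01, steps=2000):
    d, g = 1.0, 1.0                  # initial parameters
    for _ in range(steps):
        grad_d = 2 * g - 2 * d       # df/dd: discriminator maximizes f
        grad_g = 2 * d               # df/dg: generator minimizes f
        d += lr_d * grad_d           # fast time scale (discriminator)
        g -= lr_g * grad_g           # slow time scale (generator)
    return d, g

d, g = ttur_toy()
# Both parameters approach the local Nash equilibrium at (0, 0).
```

With the fast discriminator step, the inner player tracks its best response d ≈ g while the slow generator drifts toward the equilibrium, mirroring the two-time-scale intuition from stochastic approximation.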

Published at NeurIPS 2017.
| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Image Generation | CIFAR-10 | WGAN-GP + TT Update Rule | FID | 24.8 | #37 |
| Image Generation | LSUN Bedroom 64x64 | WGAN-GP + TT Update Rule | FID | 9.5 | #1 |
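The FID values above are Fréchet Inception Distances, the metric this paper introduces. The real metric compares Gaussian fits to Inception-network activations of real versus generated images; the sketch below takes the Gaussian parameters (mu, sigma) directly, as an assumption for illustration, and uses a symmetric eigendecomposition in place of a general matrix square root:

```python
import numpy as np

def _sqrtm_psd(a):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, v = np.linalg.eigh(a)
    return (v * np.sqrt(np.clip(w, 0.0, None))) @ v.T

def fid(mu1, sigma1, mu2, sigma2):
    """FID = ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 (sigma1 sigma2)^(1/2))."""
    diff = mu1 - mu2
    # For PSD covariances, Tr((S1 S2)^(1/2)) = Tr((S1^(1/2) S2 S1^(1/2))^(1/2)),
    # which keeps every intermediate matrix symmetric and numpy-only.
    root1 = _sqrtm_psd(sigma1)
    tr_cross = np.trace(_sqrtm_psd(root1 @ sigma2 @ root1))
    return float(diff @ diff + np.trace(sigma1) + np.trace(sigma2) - 2.0 * tr_cross)

mu, sigma = np.zeros(2), np.eye(2)
print(fid(mu, sigma, mu, sigma))   # identical distributions give (numerically) zero
```

Lower is better: a shift of the generated mean by one unit raises the toy FID by exactly 1.0, matching the squared-mean-difference term.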

Methods used in the Paper

| Method | Type |
|---|---|
| Adam | Stochastic Optimization |
| ReLU | Activation Functions |
| DCGAN | Generative Models |
| Layer Normalization | Normalization |
| WGAN-GP Loss | Loss Functions |
| Leaky ReLU | Activation Functions |
| Batch Normalization | Normalization |
| WGAN-GP | Generative Adversarial Networks |
| TTUR | Optimization |
| Convolution | Convolutions |
| GAN | Generative Models |