LOGAN: Latent Optimisation for Generative Adversarial Networks

Training generative adversarial networks requires balancing delicate adversarial dynamics. Even with careful tuning, training may diverge or end up in a bad equilibrium with dropped modes...
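
LOGAN's core mechanism is latent optimisation: before each training step, the latent vector z is refined by following the gradient of the discriminator's score of the generated sample, so that the subsequent generator and discriminator updates see a stronger, "optimised" latent. Below is a minimal PyTorch sketch of one such refinement step, covering both the plain gradient variant and the damped natural-gradient variant reported as LOGAN (NGD) in the results table; the toy G/D networks and the alpha/beta values are illustrative assumptions, not the paper's BigGAN-deep architecture or hyperparameters.

```python
import torch
import torch.nn as nn

# Toy stand-in networks; the paper's actual model is BigGAN-deep on ImageNet 128x128.
latent_dim = 128
G = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, 784))
D = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 1))

def refine_latents(z, alpha=0.9, beta=5.0, use_ngd=True):
    """Take one latent-optimisation step on z before the usual G/D updates.

    The step follows the gradient of the discriminator's score of the
    generated batch, i.e. it nudges z so that G(z) looks more real to D.
    With use_ngd=True, the raw gradient is replaced by a damped
    natural-gradient step (the "(NGD)" variant). alpha and beta are
    illustrative values, not the paper's settings.
    """
    z = z.detach().requires_grad_(True)
    score = D(G(z)).sum()                 # f(z) = D(G(z)), summed over the batch
    (g,) = torch.autograd.grad(score, z)  # df/dz, one gradient per sample
    if use_ngd:
        # Damped natural-gradient direction: with the rank-1 curvature
        # estimate g g^T + beta*I, the Sherman-Morrison formula reduces
        # the update to a per-sample rescaling of the gradient.
        g = g / (beta + g.pow(2).sum(dim=1, keepdim=True))
    return (z + alpha * g).detach()

z = torch.randn(8, latent_dim)            # sample from the latent prior
z_refined = refine_latents(z)
# z_refined then replaces z in both the generator and discriminator losses.
```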

Datasets

ImageNet

Results from the Paper

TASK                          DATASET           MODEL        METRIC NAME      METRIC VALUE  GLOBAL RANK
Conditional Image Generation  ImageNet 128x128  LOGAN (NGD)  FID              3.36          #1
                                                             Inception score  148.2         #3

Methods used in the Paper


METHOD                                    TYPE
Dense Connections                         Feedforward Networks
Euclidean Norm Regularization             Regularization
Softmax                                   Output Functions
Natural Gradient Descent                  Optimization
Bottleneck Residual Block                 Skip Connection Blocks
Feedforward Network                       Feedforward Networks
Residual Connection                       Skip Connections
Non-Local Operation                       Image Feature Extractors
1x1 Convolution                           Convolutions
Dot-Product Attention                     Attention Mechanisms
SAGAN Self-Attention Module               Attention Modules
Adam                                      Stochastic Optimization
SNGAN                                     Generative Models
Linear Layer                              Feedforward Networks
Non-Local Block                           Image Model Blocks
Truncation Trick                          Latent Variable Sampling
Conditional Batch Normalization           Normalization
TTUR                                      Optimization
GAN Hinge Loss                            Loss Functions
Early Stopping                            Regularization
Spectral Normalization                    Normalization
SAGAN                                     Generative Adversarial Networks
Projection Discriminator                  Discriminators
Off-Diagonal Orthogonal Regularization    Regularization
Batch Normalization                       Normalization
ReLU                                      Activation Functions
Leaky ReLU                                Activation Functions
DCGAN                                     Generative Models
Latent Optimisation                       Latent Variable Sampling
CS-GAN                                    Generative Models
LOGAN                                     Generative Models
BigGAN-deep                               Generative Models
Convolution                               Convolutions