Metropolis-Hastings view on variational inference and adversarial training

A significant part of MCMC methods can be viewed as instances of the Metropolis-Hastings (MH) algorithm with different proposal distributions. From this point of view, the problem of constructing a sampler reduces to the question of how to choose a proposal for the MH algorithm.
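To make the framing concrete, here is a minimal sketch of the generic Metropolis-Hastings loop with a pluggable proposal; the function names (`metropolis_hastings`, `propose`, `log_q`) are illustrative and not from the paper. Swapping in a different `propose`/`log_q` pair yields a different sampler, which is exactly the design question raised above.

```python
import numpy as np

def metropolis_hastings(log_target, propose, log_q, x0, n_steps, rng):
    """Generic MH sampler.

    propose(x, rng) draws x' ~ q(x' | x);
    log_q(x_new, x_old) evaluates log q(x_new | x_old).
    """
    x = x0
    samples = []
    for _ in range(n_steps):
        x_new = propose(x, rng)
        # Acceptance ratio: target ratio corrected for proposal asymmetry.
        log_alpha = (log_target(x_new) - log_target(x)
                     + log_q(x, x_new) - log_q(x_new, x))
        if np.log(rng.uniform()) < log_alpha:
            x = x_new
        samples.append(x)
    return np.array(samples)

# Example: random-walk proposal targeting a standard normal.
rng = np.random.default_rng(0)
log_target = lambda x: -0.5 * x**2
propose = lambda x, rng: x + 0.5 * rng.normal()
log_q = lambda a, b: 0.0  # symmetric proposal, correction terms cancel
chain = metropolis_hastings(log_target, propose, log_q, 0.0, 5000, rng)
```

With a symmetric random-walk proposal this reduces to the classic Metropolis algorithm; learned or adversarially trained proposals slot into the same `propose`/`log_q` interface.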

ICLR 2019

No code implementations yet.

Datasets

None listed.

Results from the Paper

None listed.

Methods used in the Paper


METHOD                                     TYPE
Dense Connections                          Feedforward Networks
Softmax                                    Output Functions
ReLU                                       Activation Functions
Feedforward Network                        Feedforward Networks
Conditional Batch Normalization            Normalization
Residual Block                             Skip Connection Blocks
TTUR                                       Optimization
GAN Hinge Loss                             Loss Functions
Residual Connection                        Skip Connections
Non-Local Operation                        Image Feature Extractors
Non-Local Block                            Image Model Blocks
Truncation Trick                           Latent Variable Sampling
Linear Layer                               Feedforward Networks
Dot-Product Attention                      Attention Mechanisms
Projection Discriminator                   Discriminators
Spectral Normalization                     Normalization
Off-Diagonal Orthogonal Regularization     Regularization
Convolution                                Convolutions
Adam                                       Stochastic Optimization
Batch Normalization                        Normalization
Early Stopping                             Regularization
1x1 Convolution                            Convolutions
SAGAN Self-Attention Module                Attention Modules
SAGAN                                      Generative Adversarial Networks
BigGAN                                     Generative Models