Normalization

Virtual Batch Normalization

Introduced by Salimans et al. in Improved Techniques for Training GANs

Virtual Batch Normalization (VBN) is a normalization method used for training generative adversarial networks that extends batch normalization. Regular batch normalization makes the output of a neural network for an input example $\mathbf{x}$ highly dependent on the other inputs $\mathbf{x}'$ in the same minibatch. To avoid this, VBN normalizes each example $\mathbf{x}$ using statistics collected from a reference batch of examples, chosen once and fixed at the start of training, combined with $\mathbf{x}$ itself. The reference batch is normalized using only its own statistics. VBN is computationally expensive because it requires running forward propagation on two minibatches of data, so the authors use it only in the generator network.

Source: Improved Techniques for Training GANs
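Below is a minimal NumPy sketch of the idea, not the authors' implementation: the function names, the 2-D (batch, features) layout, and the $1/(m+1)$ weighting used to fold $\mathbf{x}$ into the reference-batch statistics are assumptions chosen for illustration.

```python
import numpy as np

def virtual_batch_norm(x, ref_batch, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize one example `x` (shape: (features,)) using the statistics
    of the fixed reference batch combined with `x` itself.

    Weighting by 1/(m+1) is equivalent to computing the mean and variance
    of the (m+1)-example batch formed by appending `x` to `ref_batch`.
    (Illustrative sketch; not the paper's reference code.)
    """
    m = ref_batch.shape[0]
    # Per-feature statistics of the reference batch alone.
    ref_mean = ref_batch.mean(axis=0)
    ref_mean_sq = (ref_batch ** 2).mean(axis=0)
    # Fold the new example into the statistics with weight 1/(m+1).
    coeff = 1.0 / (m + 1)
    mean = coeff * x + (1.0 - coeff) * ref_mean
    mean_sq = coeff * x ** 2 + (1.0 - coeff) * ref_mean_sq
    var = mean_sq - mean ** 2  # biased variance, as in batch norm
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def normalize_reference_batch(ref_batch, gamma=1.0, beta=0.0, eps=1e-5):
    """The reference batch itself is normalized with only its own statistics."""
    mean = ref_batch.mean(axis=0)
    var = ref_batch.var(axis=0)
    return gamma * (ref_batch - mean) / np.sqrt(var + eps) + beta

# Usage: pick the reference batch once at the start of training, then
# normalize each incoming example against it.
rng = np.random.default_rng(0)
ref = rng.normal(size=(64, 128))   # fixed reference batch
x = rng.normal(size=(128,))        # a new example
y = virtual_batch_norm(x, ref)
```

Note that in practice the learnable scale $\gamma$ and shift $\beta$ are per-feature parameters, and the reference batch requires its own forward pass each step, which is the extra cost mentioned above.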
