# Adjusting for Dropout Variance in Batch Normalization and Weight Initialization

8 Jul 2016 · Dan Hendrycks, Kevin Gimpel

We show how to adjust for the variance introduced by dropout with corrections to weight initialization and Batch Normalization, yielding higher accuracy. Though dropout can preserve the expected input to a neuron between train and test, the variance of the input differs...
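The train/test discrepancy the abstract describes can be seen directly in a small simulation. The sketch below (an illustration, not the paper's method) applies standard inverted dropout with keep probability `p` to a unit-variance input: the mean of the activation is preserved, but its variance at train time is inflated relative to test time, which is the mismatch the paper's corrections target. The choice `p = 0.5` is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5  # keep probability (illustrative choice)
x = rng.standard_normal(1_000_000)  # zero-mean, unit-variance input

# Inverted dropout: zero each unit with probability 1 - p,
# rescale survivors by 1/p so the expectation is unchanged.
mask = rng.random(x.shape) < p
train_out = x * mask / p
test_out = x  # dropout is disabled at test time

# Means match, but the train-time variance is inflated by roughly 1/p.
print(train_out.mean(), test_out.mean())  # both near 0
print(train_out.var(), test_out.var())    # roughly 1/p = 2.0 vs 1.0
```

For a zero-mean input with variance σ², the train-time output variance under inverted dropout is σ²/p, so here the variance roughly doubles even though the expected activation is unchanged.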
