Recurrent Batch Normalization

30 Mar 2016 · Tim Cooijmans, Nicolas Ballas, César Laurent, Çağlar Gülçehre, Aaron Courville

We propose a reparameterization of LSTM that brings the benefits of batch normalization to recurrent neural networks. Whereas previous works only apply batch normalization to the input-to-hidden transformation of RNNs, we demonstrate that it is both possible and beneficial to batch-normalize the hidden-to-hidden transition, thereby reducing internal covariate shift between time steps.
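To make the reparameterization concrete, here is a minimal sketch of a batch-normalized LSTM cell in PyTorch (the paper's original code is in Theano, so the module layout and use of `nn.BatchNorm1d` are assumptions, not the authors' implementation). It follows the paper's recipe: separate batch normalization of the input-to-hidden and hidden-to-hidden pre-activations with no learned shift (a single shared bias takes that role), batch normalization of the cell state before the output nonlinearity, and BN gains initialized to 0.1 to avoid saturating tanh. The paper additionally keeps separate BN statistics per time step, which this sketch omits for brevity.

```python
import torch
import torch.nn as nn

class BNLSTMCell(nn.Module):
    """Sketch of a BN-LSTM cell (hypothetical layout, per the paper's equations)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Input-to-hidden and hidden-to-hidden transformations. No per-layer
        # bias here; a single shared bias b is added after the normalized sums.
        self.wx = nn.Linear(input_size, 4 * hidden_size, bias=False)
        self.wh = nn.Linear(hidden_size, 4 * hidden_size, bias=False)
        self.bias = nn.Parameter(torch.zeros(4 * hidden_size))
        # Separate BN for the two transformations and for the cell state.
        self.bn_x = nn.BatchNorm1d(4 * hidden_size)
        self.bn_h = nn.BatchNorm1d(4 * hidden_size)
        self.bn_c = nn.BatchNorm1d(hidden_size)
        # The paper initializes the BN gains (gamma) to 0.1.
        for bn in (self.bn_x, self.bn_h, self.bn_c):
            nn.init.constant_(bn.weight, 0.1)
        # Freeze the shifts (beta) of the pre-gate BNs at zero; the shared
        # bias b already provides the offset.
        self.bn_x.bias.requires_grad_(False)
        self.bn_h.bias.requires_grad_(False)

    def forward(self, x, state):
        h, c = state
        # Batch-normalize both pre-activations, then add the shared bias.
        gates = self.bn_x(self.wx(x)) + self.bn_h(self.wh(h)) + self.bias
        i, f, g, o = gates.chunk(4, dim=1)
        c_next = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        # The cell state is also normalized before the output nonlinearity.
        h_next = torch.sigmoid(o) * torch.tanh(self.bn_c(c_next))
        return h_next, c_next
```

Used like an ordinary `nn.LSTMCell`, the cell is stepped over the sequence, passing `(h, c)` from one time step to the next; because BN statistics are computed over the batch at each step, training requires batched inputs.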


Evaluation results from the paper


| Task | Dataset | Model | Metric name | Metric value | Global rank |
|---|---|---|---|---|---|
| Sequential Image Classification | Sequential MNIST | BN LSTM | Unpermuted Accuracy | 99% | #3 |
| Sequential Image Classification | Sequential MNIST | BN LSTM | Permuted Accuracy | 95.4% | #3 |
| Language Modelling | Text8 | BN LSTM | Bits per Character (BPC) | 1.36 | #12 |
| Language Modelling | Text8 | BN LSTM | Number of params | 16M | #1 |