Sequential Normalization: an improvement over Ghost Normalization

1 Jan 2021  ·  Neofytos Dimitriou, Ognjen Arandjelovic

Batch normalization (BatchNorm) is an effective yet poorly understood technique for neural network optimization. It is often assumed that the degradation of BatchNorm performance at smaller batch sizes stems from its having to estimate layer statistics from smaller sample sizes. Recently, however, Ghost normalization (GhostNorm), a variant of BatchNorm that explicitly normalizes over smaller sample sizes, has been shown to improve upon BatchNorm on some datasets. Our contributions are: (i) we describe three types of GhostNorm implementation, two of which employ BatchNorm as the underlying normalization technique; (ii) we uncover a source of regularization that is unique to GhostNorm, rather than simply inherited from BatchNorm, and visualize the difference in their loss landscapes; (iii) we extend GhostNorm and introduce a new type of normalization layer called Sequential Normalization (SeqNorm); (iv) we compare both GhostNorm and SeqNorm against BatchNorm alone as well as in combination with other regularization techniques; (v) for both GhostNorm and SeqNorm, we report performance superior to state-of-the-art methodologies on the CIFAR-10, CIFAR-100, and ImageNet datasets.
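To make the idea concrete, below is a minimal PyTorch sketch of one way GhostNorm can be built on top of BatchNorm, in the spirit of the BatchNorm-based implementations the abstract mentions. The class name GhostNorm2d, the num_splits parameter, and the choice of sharing a single BatchNorm layer across ghost batches are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class GhostNorm2d(nn.Module):
    """Sketch of Ghost Normalization: the mini-batch is split into
    `num_splits` smaller "ghost" batches, and each ghost batch is
    normalized independently using a shared BatchNorm layer."""

    def __init__(self, num_features, num_splits=4):
        super().__init__()
        self.num_splits = num_splits
        self.bn = nn.BatchNorm2d(num_features)

    def forward(self, x):
        n = x.shape[0]
        assert n % self.num_splits == 0, "batch size must be divisible by num_splits"
        # Normalize each ghost batch separately, then reassemble the full batch.
        chunks = x.chunk(self.num_splits, dim=0)
        return torch.cat([self.bn(chunk) for chunk in chunks], dim=0)

# Example usage: a batch of 32 feature maps normalized as 4 ghost batches of 8.
layer = GhostNorm2d(num_features=64, num_splits=4)
out = layer(torch.randn(32, 64, 16, 16))
```

Sharing one BatchNorm module across ghost batches keeps a single set of affine parameters and running statistics; keeping separate per-split statistics is an alternative design choice.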
