High-Performance Large-Scale Image Recognition Without Normalization

11 Feb 2021 · Andrew Brock, Soham De, Samuel L. Smith, Karen Simonyan

Batch normalization is a key component of most image classification models, but it has many undesirable properties stemming from its dependence on the batch size and interactions between examples. Although recent work has succeeded in training deep ResNets without normalization layers, these models do not match the test accuracies of the best batch-normalized networks, and are often unstable for large learning rates or strong data augmentations...
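
The batch-size dependence and inter-example interaction mentioned in the abstract come from the fact that batch normalization computes its statistics across the batch axis. The snippet below is an illustrative NumPy sketch (not code from the paper); the function name and tensor shapes are chosen only for the example.

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """Normalize each feature using statistics computed across the batch axis."""
    mean = x.mean(axis=0, keepdims=True)  # per-feature mean over the batch
    var = x.var(axis=0, keepdims=True)    # per-feature variance over the batch
    return (x - mean) / np.sqrt(var + eps)

x = np.random.randn(8, 4)                # 8 examples, 4 features
out_small = batch_norm(x[:2])            # first 2 examples normalized alone
out_full = batch_norm(x)[:2]             # same 2 examples normalized with 6 others
print(np.allclose(out_small, out_full))  # False: an example's output depends on
                                         # which other examples share its batch
```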


Datasets

ImageNet

Results from the Paper


Ranked #3 on Image Classification on ImageNet (using extra training data)

Image Classification on ImageNet

MODEL | TOP 1 ACCURACY (RANK) | TOP 5 ACCURACY (RANK) | NUMBER OF PARAMS (RANK) | USES EXTRA TRAINING DATA
NFNet-F6 w/ SAM | 86.5% (#14) | 97.9% (#8) | 438.4M (#8) | No
NFNet-F4+ | 89.2% (#3) | - | 527M (#4) | Yes
NFNet-F5 w/ SAM | 86.3% (#16) | - | 377.2M (#10) | No
NFNet-F5 | 86.0% (#18) | 97.6% (#12) | - | No
NFNet-F0 | 83.6% (#40) | 96.8% (#20) | 71.5M (#33) | No
NFNet-F4 | 85.9% (#19) | - | 316.1M (#11) | No
NFNet-F1 | 84.7% (#30) | 97.1% (#17) | 132.6M (#19) | No
NFNet-F2 | 85.1% (#27) | 97.3% (#15) | 193.8M (#15) | No
NFNet-F3 | 85.7% (#21) | 97.5% (#13) | 254.9M (#13) | No

Methods used in the Paper


METHOD | TYPE
GELU | Activation Functions
Gradient Clipping | Optimization
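
The gradient clipping entry above refers to the unit-wise adaptive gradient clipping (AGC) technique the paper uses to stabilize unnormalized networks at large batch sizes. The NumPy sketch below is a rough illustration, not the paper's reference implementation: it rescales each output unit's gradient whenever its norm grows large relative to the norm of the corresponding weights. The function names, the default threshold, and the epsilon values are illustrative assumptions.

```python
import numpy as np

def unitwise_norm(x):
    """Norm per output unit: scalar norms for biases/gains, row norms for weight matrices."""
    if x.ndim <= 1:
        return np.abs(x)
    # Reduce over every axis except the leading (output-unit) axis.
    return np.sqrt(np.sum(x ** 2, axis=tuple(range(1, x.ndim)), keepdims=True))

def adaptive_gradient_clip(grad, weight, clipping=0.01, eps=1e-3):
    """Rescale grad wherever ||grad|| / ||weight|| exceeds the clipping threshold, unit-wise."""
    w_norm = np.maximum(unitwise_norm(weight), eps)  # floor on the weight norm
    g_norm = unitwise_norm(grad)
    max_norm = clipping * w_norm
    scale = np.where(g_norm > max_norm, max_norm / np.maximum(g_norm, 1e-6), 1.0)
    return grad * scale

# Example: a gradient spike on one unit is scaled down, the other units are untouched.
w = np.ones((4, 16))
g = np.full((4, 16), 1e-4)
g[0] = 10.0                                          # exploding gradient on unit 0
g_clipped = adaptive_gradient_clip(g, w)
print(unitwise_norm(g_clipped).ravel())              # unit 0 clipped to 0.01 * ||w_0||
```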