Filter Response Normalization Layer: Eliminating Batch Dependence in the Training of Deep Neural Networks

Batch Normalization (BN) uses mini-batch statistics to normalize the activations during training, introducing dependence between mini-batch elements. This dependency can hurt performance if the mini-batch size is too small, or if the elements are correlated...
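As a rough illustration of the method the title refers to, below is a minimal PyTorch-style sketch of a Filter Response Normalization (FRN) layer paired with the Thresholded Linear Unit (TLU) activation proposed alongside it: each channel is normalized by the mean of its squared activations over the spatial dimensions only, so no statistics are shared across the mini-batch. The class name, NCHW layout, and the fixed eps default are illustrative assumptions, not taken from the paper's released code.

```python
import torch
import torch.nn as nn


class FilterResponseNorm2d(nn.Module):
    """Sketch of an FRN layer followed by a TLU activation (illustrative, not the official code).

    Each channel of each sample is divided by the root mean square of its own
    spatial activations, so normalization never mixes statistics across the batch.
    """

    def __init__(self, num_channels: int, eps: float = 1e-6):
        super().__init__()
        shape = (1, num_channels, 1, 1)
        self.gamma = nn.Parameter(torch.ones(shape))   # learned per-channel scale
        self.beta = nn.Parameter(torch.zeros(shape))   # learned per-channel shift
        self.tau = nn.Parameter(torch.zeros(shape))    # learned TLU threshold
        self.eps = eps                                 # small constant for numerical stability (assumed fixed here)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x is assumed to be NCHW; nu2 is the mean squared activation per sample and channel.
        nu2 = x.pow(2).mean(dim=(2, 3), keepdim=True)
        x_hat = x * torch.rsqrt(nu2 + self.eps)
        y = self.gamma * x_hat + self.beta
        # Thresholded Linear Unit: max(y, tau) replaces ReLU's fixed zero threshold.
        return torch.max(y, self.tau)


if __name__ == "__main__":
    layer = FilterResponseNorm2d(num_channels=8)
    out = layer(torch.randn(4, 8, 16, 16))
    print(out.shape)  # torch.Size([4, 8, 16, 16])
```

Swapping such a layer in for a BatchNorm + ReLU pair is the kind of substitution behind the ResNetV2-50 and InceptionV3 results listed below.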

PDF Abstract (CVPR 2020)

Datasets

ImageNet

Results from the Paper


TASK                   DATASET    MODEL                      METRIC NAME      METRIC VALUE   GLOBAL RANK
Image Classification   ImageNet   ResNetV2-50 (FRN layer)    Top 1 Accuracy   77.21%         # 129
Image Classification   ImageNet   ResNetV2-50 (FRN layer)    Top 5 Accuracy   93.57%         # 80
Image Classification   ImageNet   InceptionV3 (FRN layer)    Top 1 Accuracy   78.95%         # 102
Image Classification   ImageNet   InceptionV3 (FRN layer)    Top 5 Accuracy   94.49%         # 62

Methods used in the Paper


METHOD                                TYPE
Residual Connection                   Skip Connections
1x1 Convolution                       Convolutions
Average Pooling                       Pooling Operations
Focal Loss                            Loss Functions
FPN                                   Feature Extractors
RetinaNet                             Object Detection Models
Linear Warmup With Cosine Annealing   Learning Rate Schedules
VGG                                   Convolutional Neural Networks
Global Average Pooling                Pooling Operations
Bottleneck Residual Block             Skip Connection Blocks
Residual Block                        Skip Connection Blocks
Kaiming Initialization                Initialization
Batch Normalization                   Normalization
ReLU                                  Activation Functions
ResNet                                Convolutional Neural Networks
Label Smoothing                       Regularization
RMSProp                               Stochastic Optimization
Auxiliary Classifier                  Miscellaneous Components
Convolution                           Convolutions
Dense Connections                     Feedforward Networks
Max Pooling                           Pooling Operations
Inception-v3 Module                   Image Model Blocks
Dropout                               Regularization
Softmax                               Output Functions
Inception-v3                          Convolutional Neural Networks
Filter Response Normalization         Normalization