Legacy SE ResNet

Last updated on Feb 14, 2021

legacy_seresnet101

| Field | Value |
|-------|-------|
| Parameters | 49 Million |
| FLOPs | 10 Billion |
| File Size | 188.66 MB |
| Training Data | ImageNet |
| Training Resources | 8x NVIDIA Titan X GPUs |
| Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
| Architecture | Squeeze-and-Excitation Block, 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
| ID | legacy_seresnet101 |
| LR | 0.6 |
| Epochs | 100 |
| Layers | 101 |
| Dropout | 0.2 |
| Crop Pct | 0.875 |
| Momentum | 0.9 |
| Batch Size | 1024 |
| Image Size | 224 |
| Interpolation | bilinear |
legacy_seresnet152

| Field | Value |
|-------|-------|
| Parameters | 67 Million |
| FLOPs | 15 Billion |
| File Size | 255.62 MB |
| Training Data | ImageNet |
| Training Resources | 8x NVIDIA Titan X GPUs |
| Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
| Architecture | Squeeze-and-Excitation Block, 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
| ID | legacy_seresnet152 |
| LR | 0.6 |
| Epochs | 100 |
| Layers | 152 |
| Dropout | 0.2 |
| Crop Pct | 0.875 |
| Momentum | 0.9 |
| Batch Size | 1024 |
| Image Size | 224 |
| Interpolation | bilinear |
legacy_seresnet18

| Field | Value |
|-------|-------|
| Parameters | 12 Million |
| FLOPs | 2 Billion |
| File Size | 44.99 MB |
| Training Data | ImageNet |
| Training Resources | 8x NVIDIA Titan X GPUs |
| Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
| Architecture | Squeeze-and-Excitation Block, 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
| ID | legacy_seresnet18 |
| LR | 0.6 |
| Epochs | 100 |
| Layers | 18 |
| Dropout | 0.2 |
| Crop Pct | 0.875 |
| Momentum | 0.9 |
| Batch Size | 1024 |
| Image Size | 224 |
| Interpolation | bicubic |
legacy_seresnet34

| Field | Value |
|-------|-------|
| Parameters | 22 Million |
| FLOPs | 5 Billion |
| File Size | 83.88 MB |
| Training Data | ImageNet |
| Training Resources | 8x NVIDIA Titan X GPUs |
| Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
| Architecture | Squeeze-and-Excitation Block, 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
| ID | legacy_seresnet34 |
| LR | 0.6 |
| Epochs | 100 |
| Layers | 34 |
| Dropout | 0.2 |
| Crop Pct | 0.875 |
| Momentum | 0.9 |
| Batch Size | 1024 |
| Image Size | 224 |
| Interpolation | bilinear |
legacy_seresnet50

| Field | Value |
|-------|-------|
| Parameters | 28 Million |
| FLOPs | 5 Billion |
| File Size | 107.39 MB |
| Training Data | ImageNet |
| Training Resources | 8x NVIDIA Titan X GPUs |
| Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
| Architecture | Squeeze-and-Excitation Block, 1x1 Convolution, Bottleneck Residual Block, Batch Normalization, Convolution, Global Average Pooling, Residual Block, Residual Connection, ReLU, Max Pooling, Softmax |
| ID | legacy_seresnet50 |
| LR | 0.6 |
| Epochs | 100 |
| Layers | 50 |
| Dropout | 0.2 |
| Crop Pct | 0.875 |
| Momentum | 0.9 |
| Batch Size | 1024 |
| Image Size | 224 |
| Interpolation | bilinear |

Summary

SE-ResNet is a ResNet variant that adds squeeze-and-excitation (SE) blocks, which let the network perform dynamic channel-wise feature recalibration.
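
To make the recalibration concrete, here is a minimal PyTorch sketch of a squeeze-and-excitation block. It is an illustration, not timm's exact implementation; the reduction ratio of 16 follows the paper's default.

import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Minimal squeeze-and-excitation block (illustrative sketch)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squeeze: global average pooling collapses each channel to a scalar
        s = x.mean(dim=(2, 3))
        # Excitation: a bottleneck MLP produces per-channel gates in (0, 1)
        s = torch.sigmoid(self.fc2(torch.relu(self.fc1(s))))
        # Recalibrate: rescale each input channel by its learned gate
        return x * s[:, :, None, None]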

How do I load this model?

To load a pretrained model:

import timm

# Create the model and load the pretrained ImageNet weights
m = timm.create_model('legacy_seresnet101', pretrained=True)

# Switch to inference mode (disables dropout, uses running batch-norm statistics)
m.eval()

Replace the model name with the variant you want to use, e.g. legacy_seresnet101. You can find the IDs in the model summaries at the top of this page.
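
To see which variants are available and to build the matching input preprocessing, you can use timm's data helpers. A short sketch, assuming a recent timm version that provides resolve_data_config and create_transform:

import timm
from timm.data import resolve_data_config, create_transform

# List all SE-ResNet variants shipped under the legacy_ prefix
print(timm.list_models('legacy_seresnet*'))

m = timm.create_model('legacy_seresnet101', pretrained=True).eval()

# Build the input pipeline (resize, crop pct, interpolation, normalization)
# matching the model's pretrained configuration from the tables above
config = resolve_data_config({}, model=m)
transform = create_transform(**config)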

How do I train this model?

You can follow the timm recipe scripts for training a new model from scratch.
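
If you prefer to wire up the recipe manually, the tables above pin down most of it. A minimal sketch of the optimizer and loss; the weight-decay value (1e-4) and label-smoothing factor (0.1) are assumptions, since the tables do not list them:

import timm
import torch

model = timm.create_model('legacy_seresnet50', pretrained=False)

# SGD with momentum, as listed in the tables above (LR 0.6, momentum 0.9);
# weight decay 1e-4 is an assumed value, not taken from the tables
optimizer = torch.optim.SGD(model.parameters(), lr=0.6,
                            momentum=0.9, weight_decay=1e-4)

# Label smoothing (requires PyTorch >= 1.10); the 0.1 factor is an assumption
criterion = torch.nn.CrossEntropyLoss(label_smoothing=0.1)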

Citation

@misc{hu2019squeezeandexcitation,
      title={Squeeze-and-Excitation Networks}, 
      author={Jie Hu and Li Shen and Samuel Albanie and Gang Sun and Enhua Wu},
      year={2019},
      eprint={1709.01507},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Results

Image Classification on ImageNet

| Model | Top-1 Accuracy | Top-5 Accuracy |
|-------|----------------|----------------|
| legacy_seresnet152 | 78.67% | 94.38% |
| legacy_seresnet101 | 78.38% | 94.26% |
| legacy_seresnet50 | 77.64% | 93.74% |
| legacy_seresnet34 | 74.79% | 92.13% |
| legacy_seresnet18 | 71.74% | 90.34% |