Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
---|---|
Architecture | 1x1 Convolution, Squeeze-and-Excitation Block, Batch Normalization, Convolution, Grouped Convolution, Global Average Pooling, ResNeXt Block, Residual Connection, ReLU, Max Pooling, Softmax |
ID | legacy_seresnext101_32x4d |
Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
---|---|
Architecture | 1x1 Convolution, Squeeze-and-Excitation Block, Batch Normalization, Convolution, Grouped Convolution, Global Average Pooling, ResNeXt Block, Residual Connection, ReLU, Max Pooling, Softmax |
ID | legacy_seresnext26_32x4d |
Training Techniques | SGD with Momentum, Weight Decay, Label Smoothing |
---|---|
Architecture | 1x1 Convolution, Squeeze-and-Excitation Block, Batch Normalization, Convolution, Grouped Convolution, Global Average Pooling, ResNeXt Block, Residual Connection, ReLU, Max Pooling, Softmax |
ID | legacy_seresnext50_32x4d |
**SE ResNeXt** is a variant of ResNeXt that employs [squeeze-and-excitation blocks](https://paperswithcode.com/method/squeeze-and-excitation-block) to enable the network to perform dynamic channel-wise feature recalibration.
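The channel-wise recalibration can be sketched as a small module: global-average-pool each feature map to a channel descriptor, pass it through a two-layer bottleneck, and use the resulting sigmoid gates to rescale the channels. This is a minimal illustration in PyTorch, not timm's implementation; the `reduction=16` ratio follows the original paper's default.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Minimal squeeze-and-excitation sketch: squeeze (global average
    pool), excite (two-layer bottleneck), then rescale channels."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):
        b, c, _, _ = x.shape
        s = x.mean(dim=(2, 3))          # squeeze: (B, C) channel descriptor
        s = torch.relu(self.fc1(s))     # excitation bottleneck
        s = torch.sigmoid(self.fc2(s))  # per-channel gates in (0, 1)
        return x * s.view(b, c, 1, 1)   # recalibrate feature maps

x = torch.randn(2, 64, 8, 8)
out = SEBlock(64)(x)
print(out.shape)  # torch.Size([2, 64, 8, 8])
```

The output keeps the input's shape; only the per-channel scaling changes, which is why the block slots into existing ResNeXt blocks with negligible overhead.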
To load a pretrained model:

```python
import timm
m = timm.create_model('legacy_seresnext101_32x4d', pretrained=True)
m.eval()
```

Replace the model name with the variant you want to use, e.g. `legacy_seresnext101_32x4d`. You can find the IDs in the model summaries at the top of this page.
You can follow the [timm recipe scripts](https://rwightman.github.io/pytorch-image-models/scripts/) for training a new model from scratch.
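The training techniques listed in the summaries above (SGD with momentum, weight decay, label smoothing) map directly onto standard PyTorch components. This is a minimal sketch of one training step with illustrative hyperparameters and a stand-in model, not the published recipe:

```python
import torch
import torch.nn as nn

# Stand-in model and batch; hyperparameter values are illustrative.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))

# One optimization step: forward, smoothed loss, backward, update.
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

Weight decay here is the optimizer's L2 penalty, and label smoothing softens the one-hot targets so the network is penalized for overconfident predictions.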
```bibtex
@misc{hu2019squeezeandexcitation,
  title={Squeeze-and-Excitation Networks},
  author={Jie Hu and Li Shen and Samuel Albanie and Gang Sun and Enhua Wu},
  year={2019},
  eprint={1709.01507},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
```
BENCHMARK | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK |
---|---|---|---|---|
ImageNet | legacy_seresnext101_32x4d | Top 1 Accuracy | 80.23% | #93 |
ImageNet | legacy_seresnext101_32x4d | Top 5 Accuracy | 95.02% | #93 |
ImageNet | legacy_seresnext50_32x4d | Top 1 Accuracy | 79.08% | #133 |
ImageNet | legacy_seresnext50_32x4d | Top 5 Accuracy | 94.43% | #133 |
ImageNet | legacy_seresnext26_32x4d | Top 1 Accuracy | 77.11% | #193 |
ImageNet | legacy_seresnext26_32x4d | Top 5 Accuracy | 93.31% | #193 |