All RegNetX variants share the same training setup and architectural components:

Training Techniques | SGD with Momentum, Weight Decay |
---|---|
Architecture | 1x1 Convolution, Batch Normalization, Convolution, Dense Connections, Global Average Pooling, Grouped Convolution, ReLU |
IDs | regnetx_002, regnetx_004, regnetx_006, regnetx_008, regnetx_016, regnetx_032, regnetx_040, regnetx_064, regnetx_080, regnetx_120, regnetx_160, regnetx_320 |
RegNetX is a convolutional network design space of simple, regular models parameterised by a depth $d$, an initial width $w_{0} > 0$, and a slope $w_{a} > 0$, which together generate a block width $u_{j}$ for each block $j < d$. The key restriction of the RegNet design space is that block widths follow a linear parameterisation (the design space only contains models with this linear structure):
$$ u_{j} = w_{0} + w_{a}\cdot{j} $$
For RegNetX we have additional restrictions: we set $b = 1$ (the bottleneck ratio), $12 \leq d \leq 28$, and $w_{m} \geq 2$ (the width multiplier).
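The width generation above can be sketched in a few lines of Python. The quantisation step, which snaps each continuous width $u_{j}$ to $w_{0} \cdot w_{m}^{s}$ for an integer stage index $s$ and then rounds to a multiple of 8, follows the procedure described in the paper; the parameter values in the usage line are illustrative, not those of any particular regnetx variant.

```python
import math

def regnet_widths(d, w0, wa, wm, q=8):
    """Generate per-block widths for a RegNet model.

    d  : depth (number of blocks)
    w0 : initial width (> 0)
    wa : slope (> 0)
    wm : width multiplier (>= 2)
    q  : widths are rounded to multiples of q (8 in the paper)
    """
    widths = []
    for j in range(d):
        u_j = w0 + wa * j                       # linear parameterisation
        # quantise: snap u_j to w0 * wm**s for an integer stage index s
        s_j = round(math.log(u_j / w0, wm))
        w_j = w0 * wm ** s_j
        widths.append(int(round(w_j / q) * q))  # round to a multiple of q
    return widths

# Illustrative parameters, not a real regnetx configuration
print(regnet_widths(d=13, w0=24, wa=36.44, wm=2.5))
```

Because $u_{j}$ increases linearly with $j$ but the stage index is rounded, consecutive blocks share the same quantised width, which is what groups blocks into stages of constant width.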
To load a pretrained model:

```python
import timm

m = timm.create_model('regnetx_002', pretrained=True)
m.eval()
```

Replace the model name with the variant you want to use, e.g. `regnetx_002`. You can find the IDs in the model summaries at the top of this page.
You can follow the timm recipe scripts to train a new model from scratch.
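As a rough sketch, training with timm's `train.py` script might look like the command below. The dataset path and hyperparameter values are illustrative assumptions, not the recipe used for the pretrained weights; consult the timm training documentation for the flags supported by your version.

```shell
# Illustrative invocation of timm's training script; the dataset path
# and hyperparameters here are placeholders, not an official recipe.
python train.py /path/to/imagenet \
    --model regnetx_002 \
    --epochs 100 \
    --batch-size 128 \
    --lr 0.4 \
    --momentum 0.9 \
    --weight-decay 5e-5
```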
```bibtex
@misc{radosavovic2020designing,
      title={Designing Network Design Spaces},
      author={Ilija Radosavovic and Raj Prateek Kosaraju and Ross Girshick and Kaiming He and Piotr Dollár},
      year={2020},
      eprint={2003.13678},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```
MODEL | TOP 1 ACCURACY | TOP 5 ACCURACY |
---|---|---|
regnetx_320 | 80.25% | 95.03% |
regnetx_160 | 79.84% | 94.82% |
regnetx_120 | 79.61% | 94.73% |
regnetx_080 | 79.21% | 94.55% |
regnetx_064 | 79.06% | 94.47% |
regnetx_040 | 78.48% | 94.25% |
regnetx_032 | 78.15% | 94.09% |
regnetx_016 | 76.95% | 93.43% |
regnetx_008 | 75.05% | 92.34% |
regnetx_006 | 73.84% | 91.68% |
regnetx_004 | 72.39% | 90.82% |
regnetx_002 | 68.75% | 88.56% |