| ID | Architecture |
|---|---|
| tf_mixnet_l | Batch Normalization, Dense Connections, Dropout, Global Average Pooling, Grouped Convolution, MixConv, Squeeze-and-Excitation Block, Swish |
| tf_mixnet_m | Batch Normalization, Dense Connections, Dropout, Global Average Pooling, Grouped Convolution, MixConv, Squeeze-and-Excitation Block, Swish |
| tf_mixnet_s | Batch Normalization, Dense Connections, Dropout, Global Average Pooling, Grouped Convolution, MixConv, Squeeze-and-Excitation Block, Swish |
MixNet is a convolutional neural network discovered via AutoML that uses MixConv layers, which mix several depthwise convolution kernel sizes within a single layer, in place of regular depthwise convolutions.
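The core idea of MixConv is to partition the input channels into groups and apply a different depthwise kernel size to each group, so one layer captures patterns at multiple resolutions. A minimal NumPy sketch of that idea follows; the function names, random kernels, and the even channel split are illustrative assumptions, not timm's actual implementation:

```python
import numpy as np

def depthwise_conv2d(x, kernels):
    """Naive 'same'-padded depthwise convolution.
    x: (C, H, W) input; kernels: (C, k, k), one odd-sized kernel per channel."""
    C, H, W = x.shape
    k = kernels.shape[-1]
    p = k // 2
    xp = np.pad(x, ((0, 0), (p, p), (p, p)))
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(xp[c, i:i + k, j:j + k] * kernels[c])
    return out

def mixconv(x, kernel_sizes, rng=np.random.default_rng(0)):
    """MixConv sketch: split channels into len(kernel_sizes) groups and
    run each group through a depthwise conv with its own kernel size."""
    groups = np.array_split(x, len(kernel_sizes), axis=0)
    outs = []
    for g, k in zip(groups, kernel_sizes):
        kernels = rng.standard_normal((g.shape[0], k, k)) * 0.1  # illustrative weights
        outs.append(depthwise_conv2d(g, kernels))
    # Concatenating the group outputs restores the original channel count.
    return np.concatenate(outs, axis=0)
```

For example, `mixconv(x, [3, 5, 7])` on a `(8, H, W)` input processes roughly a third of the channels with each kernel size, while a regular depthwise convolution would use one kernel size for all channels.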
To load a pretrained model:

```python
import timm

m = timm.create_model('tf_mixnet_s', pretrained=True)
m.eval()
```
Replace the model name with the variant you want to use, e.g. `tf_mixnet_s`. You can find the IDs in the model summaries at the top of this page.
You can follow the timm recipe scripts to train a new model from scratch.
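As a sketch, a single-GPU run with timm's `train.py` script might look like the following; the dataset path and hyperparameters are placeholders, not the recipe used to produce the released weights:

```shell
# Train tf_mixnet_s from scratch on an ImageNet-style folder dataset.
# /path/to/imagenet and the hyperparameter values are illustrative only.
python train.py /path/to/imagenet --model tf_mixnet_s \
    --epochs 100 --batch-size 128 --lr 0.1 --sched cosine
```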
```bibtex
@misc{tan2019mixconv,
  title={MixConv: Mixed Depthwise Convolutional Kernels},
  author={Mingxing Tan and Quoc V. Le},
  year={2019},
  eprint={1907.09595},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
```
| Benchmark | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|
| ImageNet | tf_mixnet_l | Top 1 Accuracy | 78.78% | #143 |
| ImageNet | tf_mixnet_l | Top 5 Accuracy | 94.0% | #143 |
| ImageNet | tf_mixnet_m | Top 1 Accuracy | 76.96% | #198 |
| ImageNet | tf_mixnet_m | Top 5 Accuracy | 93.16% | #198 |
| ImageNet | tf_mixnet_s | Top 1 Accuracy | 75.68% | #218 |
| ImageNet | tf_mixnet_s | Top 5 Accuracy | 92.64% | #218 |