| ID | nasnetalarge |
|---|---|
| Architecture | Depthwise Separable Convolution, Batch Normalization, ReLU, Average Pooling, Convolution, Dropout |
| Training Techniques | RMSProp, Weight Decay, Label Smoothing |
NASNet is a type of convolutional neural network discovered through neural architecture search. The building blocks consist of normal and reduction cells.
To load a pretrained model:
```python
import timm
m = timm.create_model('nasnetalarge', pretrained=True)
m.eval()
```
Replace the model name with the variant you want to use, e.g. `nasnetalarge`. You can find the IDs in the model summaries at the top of this page.
You can follow the timm recipe scripts to train a new model from scratch.
```bibtex
@misc{zoph2018learning,
  title={Learning Transferable Architectures for Scalable Image Recognition},
  author={Barret Zoph and Vijay Vasudevan and Jonathon Shlens and Quoc V. Le},
  year={2018},
  eprint={1707.07012},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
```
| Benchmark | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|
| ImageNet | nasnetalarge | Top 1 Accuracy | 82.63% | #39 |
| ImageNet | nasnetalarge | Top 5 Accuracy | 96.05% | #39 |