NASNet

Last updated on Feb 14, 2021

nasnetalarge

Parameters: 89 Million
FLOPs: 30 Billion
File Size: 339.56 MB
Training Data: ImageNet
Training Resources: 50x Tesla K40 GPUs
Training Techniques: RMSProp, Weight Decay, Label Smoothing
Architecture: Depthwise Separable Convolution, Batch Normalization, ReLU, Average Pooling, Convolution, Dropout
ID: nasnetalarge
Dropout: 0.5
Crop Pct: 0.911
Momentum: 0.9
Image Size: 331
Interpolation: bicubic
Label Smoothing: 0.1
RMSProp $\epsilon$: 1.0

Summary

NASNet is a convolutional neural network architecture discovered through neural architecture search. It is built from two types of searched building blocks: normal cells, which return a feature map of the same spatial dimensions as their input, and reduction cells, which reduce the feature map height and width by a factor of two.

How do I load this model?

To load a pretrained model:

import timm

# Create the pretrained model and put it in evaluation mode for inference
m = timm.create_model('nasnetalarge', pretrained=True)
m.eval()

Replace the model name with the variant you want to use, e.g. nasnetalarge. You can find the IDs in the model summaries at the top of this page.
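
To run the model on an image, timm can build the preprocessing pipeline that matches the pretrained weights (for this model: 331x331 input, bicubic interpolation, 0.911 crop percentage). A minimal inference sketch; the image path 'dog.jpg' is a placeholder:

import timm
import torch
from PIL import Image

m = timm.create_model('nasnetalarge', pretrained=True)
m.eval()

# Build the preprocessing transform from the model's pretrained config
config = timm.data.resolve_data_config({}, model=m)
transform = timm.data.create_transform(**config)

img = Image.open('dog.jpg').convert('RGB')  # placeholder path
batch = transform(img).unsqueeze(0)         # shape: (1, 3, 331, 331)

with torch.no_grad():
    probs = m(batch).softmax(dim=-1)
top5 = probs.topk(5)
print(top5.indices, top5.values)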

How do I train this model?

You can follow the timm recipe scripts to train a new model from scratch.
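
The hyperparameters listed above outline the recipe: RMSProp with momentum 0.9 and $\epsilon$ = 1.0, weight decay, label smoothing 0.1, and dropout 0.5 at 331x331 resolution. A minimal PyTorch sketch of one training step under those settings; the learning-rate and weight-decay values are assumptions, since this card does not list them:

import timm
import torch

# Dropout 0.5 as listed above; drop_rate plumbs it into the model
model = timm.create_model('nasnetalarge', pretrained=False,
                          num_classes=1000, drop_rate=0.5)

# RMSProp with momentum 0.9 and eps 1.0 per the card; lr and
# weight_decay are placeholder values, not taken from this page
optimizer = torch.optim.RMSprop(model.parameters(), lr=0.01,
                                momentum=0.9, eps=1.0, weight_decay=1e-5)

# Label smoothing 0.1, as listed above
criterion = torch.nn.CrossEntropyLoss(label_smoothing=0.1)

# One illustrative step on random tensors standing in for an ImageNet batch
images = torch.randn(8, 3, 331, 331)
labels = torch.randint(0, 1000, (8,))

model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()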

Citation

@misc{zoph2018learning,
      title={Learning Transferable Architectures for Scalable Image Recognition}, 
      author={Barret Zoph and Vijay Vasudevan and Jonathon Shlens and Quoc V. Le},
      year={2018},
      eprint={1707.07012},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Results

Image Classification on ImageNet

Benchmark: ImageNet
Model: nasnetalarge
Top 1 Accuracy: 82.63% (global rank #39)
Top 5 Accuracy: 96.05% (global rank #39)