PNASNet

Last updated on Feb 14, 2021

pnasnet5large

Parameters 86 Million
FLOPs 31 Billion
File Size 329.16 MB
Training Data ImageNet
Training Resources 100x NVIDIA P100 GPUs
Training Time

Training Techniques RMSProp, Weight Decay, Label Smoothing
Architecture Depthwise Separable Convolution, Batch Normalization, ReLU, Average Pooling, Convolution, Dropout
ID pnasnet5large
LR 0.015
Dropout 0.5
Crop Pct 0.911
Momentum 0.9
Batch Size 1600
Image Size 331
Interpolation bicubic
Label Smoothing 0.1

Summary

Progressive Neural Architecture Search, or PNAS, is a method for learning the structure of convolutional neural networks (CNNs). It uses a sequential model-based optimization (SMBO) strategy, where we search the space of cell structures, starting with simple (shallow) models and progressing to complex ones, pruning out unpromising structures as we go.
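The progressive search loop can be sketched as a beam search over cell structures. The following toy Python sketch illustrates the idea only; the operation names, the `surrogate` scoring function, and the beam width are illustrative stand-ins (in the actual paper, the surrogate is a learned predictor trained on measured accuracies of already-evaluated cells).

```python
# Toy sketch of SMBO-style progressive search (not the real PNAS code).
# Cells are tuples of operation names; each round adds one more block.
OPS = ["sep3x3", "sep5x5", "maxpool", "identity"]

def surrogate(cell):
    # Hypothetical stand-in for the learned accuracy predictor:
    # any cheap function that ranks candidate cells without training them.
    return sum(len(op) for op in cell)

def progressive_search(max_blocks=3, beam=2):
    beam_cells = [()]  # start from the simplest (empty) cell
    for _ in range(max_blocks):
        # expand every surviving cell by one more block
        candidates = [cell + (op,) for cell in beam_cells for op in OPS]
        # rank with the surrogate and prune unpromising structures
        candidates.sort(key=surrogate, reverse=True)
        beam_cells = candidates[:beam]
    return beam_cells
```

The pruning step is what makes the search tractable: instead of training every expanded candidate, only the top-ranked few survive to the next (deeper) round.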

How do I load this model?

To load a pretrained model:

import timm
m = timm.create_model('pnasnet5large', pretrained=True)
m.eval()

Replace the model name with the variant you want to use, e.g. pnasnet5large. You can find the IDs in the model summaries at the top of this page.

How do I train this model?

You can follow the timm recipe scripts for training a new model from scratch.
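As a hedged sketch, a timm `train.py` invocation could wire up the hyperparameters listed above; the dataset path is a placeholder, and the exact optimizer variant and distributed-launch setup used for the published weights are not specified here.

```shell
# Hypothetical training command using the hyperparameters from this card;
# /path/to/imagenet is a placeholder for your local ImageNet directory.
python train.py /path/to/imagenet \
  --model pnasnet5large \
  --opt rmsproptf \
  --lr 0.015 \
  --momentum 0.9 \
  --batch-size 1600 \
  --drop 0.5 \
  --smoothing 0.1 \
  --img-size 331
```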

Citation

@misc{liu2018progressive,
      title={Progressive Neural Architecture Search}, 
      author={Chenxi Liu and Barret Zoph and Maxim Neumann and Jonathon Shlens and Wei Hua and Li-Jia Li and Li Fei-Fei and Alan Yuille and Jonathan Huang and Kevin Murphy},
      year={2018},
      eprint={1712.00559},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}

Results

Image Classification on ImageNet

Benchmark: ImageNet
Model: pnasnet5large
Top 1 Accuracy: 0.98% (Global Rank #329)
Top 5 Accuracy: 18.58% (Global Rank #329)