Training Techniques | RMSProp, Weight Decay, Label Smoothing |
---|---|
Architecture | Depthwise Separable Convolution, Batch Normalization, ReLU, Average Pooling, Convolution, Dropout |
ID | pnasnet5large |
Progressive Neural Architecture Search, or PNAS, is a method for learning the structure of convolutional neural networks (CNNs). It uses a sequential model-based optimization (SMBO) strategy, where we search the space of cell structures, starting with simple (shallow) models and progressing to complex ones, pruning out unpromising structures as we go.
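The loop below is a minimal, illustrative sketch of that SMBO idea, not the actual PNAS implementation: cells are grown one block at a time, a learned surrogate predictor scores the expanded candidates cheaply, and only the top-K candidates are trained and used to refine the predictor. All names and values here (SurrogatePredictor, expand_candidates, train_and_evaluate, K) are hypothetical stand-ins.

import random

K = 8           # beam width: number of candidate cells kept at each step (hypothetical value)
MAX_BLOCKS = 5  # grow cells block by block up to a fixed depth

class SurrogatePredictor:
    # Toy stand-in for the learned performance predictor (an RNN/MLP in the paper)
    def fit(self, cells, accuracies):
        self.history = list(zip(cells, accuracies))

    def predict(self, cell):
        return random.random()  # placeholder score

def expand_candidates(cells):
    # Grow every cell by one block; the real search enumerates all allowed block choices
    return [cell + [("sep_conv_3x3", "avg_pool_3x3")] for cell in cells]

def train_and_evaluate(cell):
    # Train a small proxy network built from the cell and return its validation accuracy (stubbed)
    return random.random()

predictor = SurrogatePredictor()
beam = [[("sep_conv_3x3", "identity")]]  # start from simple one-block cells

for _ in range(1, MAX_BLOCKS):
    candidates = expand_candidates(beam)                 # progress to more complex structures
    scores = [predictor.predict(c) for c in candidates]  # cheap surrogate scoring
    ranked = sorted(zip(scores, candidates), key=lambda t: t[0], reverse=True)
    beam = [c for _, c in ranked[:K]]                    # prune unpromising structures
    accuracies = [train_and_evaluate(c) for c in beam]   # expensive step: only top-K are trained
    predictor.fit(beam, accuracies)                      # refine the surrogate with the new data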
To load a pretrained model:
import timm

# Create PNASNet-5 (Large) with pretrained ImageNet weights and switch to inference mode
m = timm.create_model('pnasnet5large', pretrained=True)
m.eval()
Replace the model name with the variant you want to use, e.g. pnasnet5large. You can find the IDs in the model summaries at the top of this page.
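As a quick usage sketch (building on the m created above; the image path is a placeholder), you can preprocess an image with timm's data-config helpers and read off the top-5 predictions:

import torch
from PIL import Image
from timm.data import resolve_data_config, create_transform

# Build the preprocessing pipeline the pretrained weights expect
config = resolve_data_config({}, model=m)
transform = create_transform(**config)

img = Image.open('dog.jpg').convert('RGB')  # placeholder image path
x = transform(img).unsqueeze(0)             # add a batch dimension

with torch.no_grad():
    probs = m(x).softmax(dim=1)
top5 = probs.topk(5)
print(top5.indices, top5.values)            # top-5 ImageNet class indices and probabilities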
To train a new model from scratch, you can follow the timm recipe scripts.
@misc{liu2018progressive,
title={Progressive Neural Architecture Search},
author={Chenxi Liu and Barret Zoph and Maxim Neumann and Jonathon Shlens and Wei Hua and Li-Jia Li and Li Fei-Fei and Alan Yuille and Jonathan Huang and Kevin Murphy},
year={2018},
eprint={1712.00559},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
BENCHMARK | MODEL | METRIC NAME | METRIC VALUE | GLOBAL RANK |
---|---|---|---|---|
ImageNet | pnasnet5large | Top 1 Accuracy | 0.98% | # 329 |
ImageNet | pnasnet5large | Top 5 Accuracy | 18.58% | # 329 |