| Training Techniques | SGD with Momentum, Weight Decay |
|---|---|
| Architecture | 1x1 Convolution, Convolution, Dense Connections, Dropout, FBNet Block, Global Average Pooling, Softmax |
| ID | fbnetc_100 |
FBNet is a type of convolutional neural architecture discovered through DNAS (differentiable neural architecture search). It uses a basic image model block inspired by MobileNetV2, built on depthwise convolutions and an inverted residual structure (see components).
The principal building block is the FBNet Block.
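The inverted residual pattern mentioned above can be sketched in PyTorch. This is a minimal illustrative implementation of a MobileNetV2-style block (expand, depthwise, project), not the exact FBNet Block; channel counts and the expansion ratio are assumptions for the example.

```python
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    """Illustrative inverted residual block: 1x1 expansion ->
    3x3 depthwise convolution -> 1x1 linear projection, with a
    skip connection when input and output shapes match."""
    def __init__(self, in_ch, out_ch, stride=1, expand_ratio=6):
        super().__init__()
        hidden = in_ch * expand_ratio
        self.use_skip = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            # 1x1 expansion convolution
            nn.Conv2d(in_ch, hidden, 1, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 3x3 depthwise convolution (groups == channels)
            nn.Conv2d(hidden, hidden, 3, stride=stride, padding=1,
                      groups=hidden, bias=False),
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            # 1x1 linear projection, no activation ("inverted" residual)
            nn.Conv2d(hidden, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x):
        out = self.block(x)
        return x + out if self.use_skip else out

x = torch.randn(1, 32, 56, 56)
y = InvertedResidual(32, 32)(x)            # same shape: skip active
z = InvertedResidual(32, 64, stride=2)(x)  # downsampling: no skip
```

The depthwise convolution (one filter per channel via `groups=hidden`) is what keeps the expanded middle stage cheap relative to a full convolution.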
To load a pretrained model:

```python
import timm
m = timm.create_model('fbnetc_100', pretrained=True)
m.eval()
```
Replace the model name with the variant you want to use, e.g. `fbnetc_100`. You can find the IDs in the model summaries at the top of this page.
To train a new model from scratch, you can follow the timm recipe scripts.
```bibtex
@misc{wu2019fbnet,
  title={FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search},
  author={Bichen Wu and Xiaoliang Dai and Peizhao Zhang and Yanghan Wang and Fei Sun and Yiming Wu and Yuandong Tian and Peter Vajda and Yangqing Jia and Kurt Keutzer},
  year={2019},
  eprint={1812.03443},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
```
| Benchmark | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|
| ImageNet | fbnetc_100 | Top 1 Accuracy | 75.12% | #227 |
| ImageNet | fbnetc_100 | Top 5 Accuracy | 92.37% | #227 |