We present Fast-Downsampling MobileNet (FD-MobileNet), an efficient and
accurate network for very limited computational budgets (e.g., 10-140 MFLOPs).
Our key idea is applying an aggressive downsampling strategy to the MobileNet
framework. In FD-MobileNet, we perform 32$\times$ downsampling within 12
layers, only half the number used by the original MobileNet. This design brings
three advantages: (i) It remarkably reduces the computational cost. (ii) It
increases the information capacity and achieves significant performance
improvements. (iii) It is engineering-friendly and delivers fast inference
speed in practice. Experiments on the ILSVRC 2012 and PASCAL VOC 2007 datasets
demonstrate that FD-MobileNet consistently outperforms MobileNet and achieves
results comparable to ShuffleNet under different computational budgets, for
instance, surpassing MobileNet by 5.5% in ILSVRC 2012 top-1 accuracy and by
3.6% in VOC 2007 mAP at a complexity of 12 MFLOPs. On an ARM-based
device, FD-MobileNet achieves 1.11$\times$ inference speedup over MobileNet and
1.82$\times$ over ShuffleNet under the same complexity.
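The fast-downsampling idea can be illustrated with a short sketch. The per-layer strides below are hypothetical, chosen only to show how five stride-2 layers placed within the first 12 layers reach the 32$\times$ downsampling factor; they are not the paper's exact configuration.

```python
from functools import reduce

# Hypothetical stride schedule for the first 12 layers: five stride-2
# layers interleaved with stride-1 layers. Each stride-2 layer halves
# the spatial resolution, so five of them give a 2**5 = 32x reduction.
fd_strides = [2, 1, 2, 1, 2, 1, 2, 1, 2, 1, 1, 1]

def total_downsampling(strides):
    """Cumulative spatial downsampling factor of a stride schedule."""
    return reduce(lambda acc, s: acc * s, strides, 1)

print(total_downsampling(fd_strides))  # 32
```

Because the 32$\times$ factor is reached so early, most layers operate on small feature maps, which is what cuts the computational cost and leaves budget for wider (higher-capacity) layers.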