PRE-NAS: Predictor-assisted Evolutionary Neural Architecture Search

27 Apr 2022  ·  Yameng Peng, Andy Song, Vic Ciesielski, Haytham M. Fayek, Xiaojun Chang

Neural architecture search (NAS) aims to automate architecture engineering for neural networks. It typically incurs a high computational overhead, since many candidate networks drawn from the search space must be trained and evaluated during the search. Predicting a network's performance can alleviate this overhead by removing the need to evaluate every candidate, but developing such a predictor typically requires a large number of evaluated architectures, which may be difficult to obtain. We address this challenge by proposing a novel evolutionary-based NAS strategy, Predictor-assisted E-NAS (PRE-NAS), which performs well even with an extremely small number of evaluated architectures. PRE-NAS leverages new evolutionary search strategies and integrates high-fidelity weight inheritance over generations. Unlike one-shot strategies, which may suffer from evaluation bias due to weight sharing, offspring candidates in PRE-NAS are topologically homogeneous, which circumvents this bias and leads to more accurate predictions. Extensive experiments on the NAS-Bench-201 and DARTS search spaces show that PRE-NAS can outperform state-of-the-art NAS methods. With a single GPU searching for only 0.6 days, PRE-NAS finds a competitive architecture that achieves 2.40% and 24% test error rates on CIFAR-10 and ImageNet, respectively.
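To make the search strategy concrete, below is a minimal Python sketch of a predictor-assisted evolutionary loop in the spirit of the abstract. All helper names are hypothetical placeholders rather than the paper's implementation: `encode` maps an architecture to a feature vector, `mutate` produces an offspring, `train_and_evaluate` returns a measured accuracy, and `fit_predictor` returns a callable surrogate. The paper's high-fidelity weight inheritance and topological-homogeneity constraints are omitted for brevity.

```python
import random

def predictor_assisted_evolution(init_population, n_generations,
                                 n_offspring, eval_budget,
                                 encode, mutate, train_and_evaluate,
                                 fit_predictor):
    """Evolve architectures, using a cheap surrogate predictor to
    pre-screen offspring so only the top few are actually trained.
    (Hypothetical sketch; not the authors' implementation.)"""
    # History of (architecture, measured accuracy) pairs; the initial
    # population seeds the predictor's training set.
    history = [(arch, train_and_evaluate(arch)) for arch in init_population]

    for _ in range(n_generations):
        # Refit the surrogate on every architecture measured so far.
        predictor = fit_predictor([(encode(a), acc) for a, acc in history])

        # Produce offspring by mutating the current best parents.
        parents = [a for a, _ in sorted(history, key=lambda t: t[1],
                                        reverse=True)[:5]]
        offspring = [mutate(random.choice(parents))
                     for _ in range(n_offspring)]

        # Rank offspring by *predicted* accuracy; train only the top few,
        # keeping the expensive evaluation budget small per generation.
        ranked = sorted(offspring, key=lambda a: predictor(encode(a)),
                        reverse=True)
        for arch in ranked[:eval_budget]:
            history.append((arch, train_and_evaluate(arch)))

    # Return the best architecture found and its measured accuracy.
    return max(history, key=lambda t: t[1])
```

The key design point is the asymmetry of costs: the cheap predictor scores many offspring per generation, while the expensive `train_and_evaluate` call is reserved for the handful of candidates it ranks highest, so the set of fully evaluated architectures stays very small.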


| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Neural Architecture Search | NAS-Bench-201, CIFAR-10 | PRE-NAS | Accuracy (Test) | 94.04 | #16 |
| Neural Architecture Search | NAS-Bench-201, CIFAR-10 | PRE-NAS | Accuracy (Val) | 91.37 | #12 |
| Neural Architecture Search | NAS-Bench-201, CIFAR-100 | PRE-NAS | Accuracy (Test) | 72.02 | #17 |
| Neural Architecture Search | NAS-Bench-201, CIFAR-100 | PRE-NAS | Accuracy (Val) | 71.95 | #17 |
| Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | PRE-NAS | Accuracy (Test) | 45.34 | #26 |
