Guided Evolutionary Neural Architecture Search With Efficient Performance Estimation

22 Jul 2022 · Vasco Lopes, Miguel Santos, Bruno Degardin, Luís A. Alexandre

Neural Architecture Search (NAS) methods have been successfully applied to image tasks with excellent results. However, NAS methods are often complex and tend to converge to local minima as soon as generated architectures seem to yield good results. This paper proposes GEA, a novel approach for guided NAS. GEA guides the evolution by exploring the search space: at the initialisation stage of each generation, it generates and evaluates several architectures using a zero-proxy estimator, and only the highest-scoring architecture is trained and kept for the next generation. Subsequently, GEA continuously extracts knowledge about the search space without increased complexity by generating several offspring from an existing architecture at each generation. Moreover, GEA forces exploitation of the most performant architectures through descendant generation, while simultaneously driving exploration through parent mutation and by favouring younger architectures over older ones. Experimental results demonstrate the effectiveness of the proposed method, and extensive ablation studies evaluate the importance of different parameters. Results show that GEA achieves state-of-the-art results on all datasets of the NAS-Bench-101, NAS-Bench-201 and TransNAS-Bench-101 benchmarks.
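To make the search loop concrete, the sketch below illustrates the guided evolution described in the abstract, assuming a NAS-Bench-201-like cell encoding: a zero-proxy estimator filters randomly generated (and later mutated) candidates, only the top-scoring candidate per generation is trained, and a bounded population favours younger architectures over older ones. All names here (random_architecture, mutate, zero_cost_score, train_and_evaluate, guided_evolution) are hypothetical placeholders, not the authors' implementation.

```python
import collections
import random

# Hypothetical stand-ins for the search-space primitives. The actual GEA
# operates on cell-based spaces such as NAS-Bench-201, and zero_cost_score
# would be a real zero-proxy estimator rather than a random number.
NUM_EDGES, NUM_OPS = 6, 5  # NAS-Bench-201-like cell: 6 edges, 5 operations

def random_architecture():
    return [random.randrange(NUM_OPS) for _ in range(NUM_EDGES)]

def mutate(arch):
    child = list(arch)
    child[random.randrange(NUM_EDGES)] = random.randrange(NUM_OPS)
    return child

def zero_cost_score(arch):
    return random.random()  # placeholder for the zero-proxy estimator

def train_and_evaluate(arch):
    return random.random()  # placeholder for validation accuracy

def guided_evolution(generations=50, pool_size=10, candidates=20):
    # A bounded deque implements the aging behaviour: appending past
    # maxlen silently discards the oldest entry, so younger architectures
    # are favoured over older ones.
    population = collections.deque(maxlen=pool_size)

    # Initialisation: score several random architectures with the proxy
    # and train only the highest-scoring one.
    sampled = [random_architecture() for _ in range(candidates)]
    best = max(sampled, key=zero_cost_score)
    population.append((best, train_and_evaluate(best)))

    for _ in range(generations):
        # Exploitation: use the most performant architecture as the parent.
        parent, _ = max(population, key=lambda item: item[1])
        # Exploration: mutate the parent into several offspring, keep only
        # the top proxy-scoring one, and train it.
        offspring = [mutate(parent) for _ in range(candidates)]
        best_child = max(offspring, key=zero_cost_score)
        population.append((best_child, train_and_evaluate(best_child)))

    return max(population, key=lambda item: item[1])

arch, accuracy = guided_evolution()
print(arch, accuracy)
```

In practice, zero_cost_score would be replaced by an actual zero-cost proxy computed at initialisation and train_and_evaluate by benchmark table lookups or real training, but the control flow above captures the exploitation/exploration split the abstract describes.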

Task | Dataset | Model | Metric | Value | Global Rank
Neural Architecture Search | NAS-Bench-201, CIFAR-10 | GEA | Accuracy (Test) | 93.99 | #18
Neural Architecture Search | NAS-Bench-201, CIFAR-10 | GEA | Accuracy (Val) | 91.26 | #15
Neural Architecture Search | NAS-Bench-201, CIFAR-100 | GEA | Accuracy (Test) | 72.36 | #16
Neural Architecture Search | NAS-Bench-201, CIFAR-100 | GEA | Accuracy (Val) | 72.62 | #13
Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | GEA | Accuracy (Test) | 46.04 | #19
