When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search

The key challenge in neural architecture search (NAS) is how to explore the huge search space efficiently. We propose a new NAS method called TNAS (NAS with trees), which explores only a small number of architectures yet achieves higher search accuracy. TNAS introduces an architecture tree and a binary operation tree to factorize the search space and substantially reduce the number of architectures to explore. TNAS then performs a modified bi-level breadth-first search over the two trees to discover a high-performance architecture. Impressively, TNAS finds the globally optimal architecture on CIFAR-10 in NAS-Bench-201, with a test accuracy of 94.37\%, in only four GPU hours. Its average test accuracy of 94.35\% outperforms the state of the art. Code is available at: \url{https://github.com/guochengqian/TNAS}.
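The abstract names the search procedure but not its details, so below is a minimal Python sketch of what a bi-level breadth-first search over a factorized space could look like. It is not the authors' implementation (see the linked repository for that): the NAS-Bench-201 operation names, the per-edge binary splits, the representative-architecture scoring, and the `evaluate` callback are all illustrative assumptions.

```python
from typing import Callable, List, Sequence

# Assumed NAS-Bench-201 cell: 6 edges, each choosing one of 5 candidate operations.
OPS = ["none", "skip_connect", "nor_conv_1x1", "nor_conv_3x3", "avg_pool_3x3"]
NUM_EDGES = 6


def split(ops: Sequence[str]):
    """One level of a binary operation tree: halve the candidate set."""
    mid = len(ops) // 2
    return list(ops[:mid]), list(ops[mid:])


def bilevel_bfs(evaluate: Callable[[List[str]], float]) -> List[str]:
    """Breadth-first search that narrows every edge's operation set level by level.

    `evaluate` is an assumed user-supplied proxy that scores a fully specified
    architecture (one op per edge), e.g. accuracy after a few training epochs.
    """
    # Each edge starts at the root of its operation tree: the full candidate set.
    candidates = [list(OPS) for _ in range(NUM_EDGES)]
    while any(len(c) > 1 for c in candidates):
        # One BFS level: descend one level of the operation tree on every edge.
        for e in range(NUM_EDGES):
            if len(candidates[e]) == 1:
                continue  # this edge's operation is already decided

            left, right = split(candidates[e])

            def branch_score(branch: List[str]) -> float:
                # Score a branch by trying its ops on edge `e` while every other
                # edge keeps its current leading candidate.
                base = [c[0] for c in candidates]
                return max(evaluate(base[:e] + [op] + base[e + 1:]) for op in branch)

            # Keep the better half and prune the other. Edges are narrowed
            # independently, so the search scores on the order of
            # (edges x ops) architectures instead of the 5^6 joint combinations.
            candidates[e] = left if branch_score(left) >= branch_score(right) else right
    return [c[0] for c in candidates]


if __name__ == "__main__":
    # Toy proxy that rewards 3x3 convolutions; the search converges to an
    # all-conv cell after a few BFS levels.
    toy_score = lambda arch: float(sum(op == "nor_conv_3x3" for op in arch))
    print(bilevel_bfs(toy_score))
```

Under this toy proxy the sketch evaluates roughly ten architectures per edge (about sixty in total) rather than the 15,625 cells in the full NAS-Bench-201 space, which illustrates the efficiency claim, though the real method's tree construction and scoring differ.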

Task                         Dataset                          Model   Metric           Value   Global Rank
Neural Architecture Search   NAS-Bench-201, CIFAR-10          TNAS    Accuracy (Test)  94.35   #7
Neural Architecture Search   NAS-Bench-201, CIFAR-100         TNAS    Accuracy (Test)  73.02   #11
Neural Architecture Search   NAS-Bench-201, ImageNet-16-120   TNAS    Accuracy (Test)  46.31   #15
