Automatic Subspace Evoking for Efficient Neural Architecture Search

31 Oct 2022  ·  Yaofo Chen, Yong Guo, Daihai Liao, Fanbing Lv, Hengjie Song, Mingkui Tan ·

Neural Architecture Search (NAS) aims to automatically find effective architectures from a predefined search space. However, the search space is often extremely large, so searching in it directly is non-trivial and time-consuming. To address this issue, in each search step we limit the search to a small but effective subspace, boosting both search performance and search efficiency. To this end, we propose a novel Neural Architecture Search method via Automatic Subspace Evoking (ASE-NAS) that finds promising architectures in automatically evoked subspaces. Specifically, we first perform a global search, i.e., automatic subspace evoking, to select a good subspace from a set of candidates. Then, we perform a local search within the evoked subspace to find an effective architecture. More critically, we further boost search performance by taking well-designed or previously searched architectures as the initial candidate subspaces. Extensive experiments show that our ASE-NAS not only greatly reduces the search cost but also finds better architectures than state-of-the-art methods in various benchmark search spaces.
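The two-stage procedure described in the abstract can be sketched in a toy form. The sketch below is an illustrative assumption, not the authors' implementation: here a "subspace" is simply a small list of candidate architectures, `score_architecture` is a stand-in for validation accuracy, and the probing heuristic in `evoke_subspace` is a simplification of the paper's learned subspace-evoking step.

```python
import random

def score_architecture(arch):
    # Stand-in for validation accuracy; an "architecture" here is
    # just a tuple of layer widths (toy proxy, not the paper's model).
    return sum(arch)

def evoke_subspace(subspaces, n_probe=2, rng=None):
    """Global search (toy version of automatic subspace evoking):
    rank candidate subspaces by the mean score of a few randomly
    probed architectures and return the most promising subspace."""
    rng = rng or random.Random(0)

    def probe(space):
        sample = rng.sample(space, min(n_probe, len(space)))
        return sum(score_architecture(a) for a in sample) / len(sample)

    return max(subspaces, key=probe)

def local_search(subspace):
    """Local search: score every architecture inside the evoked
    subspace (feasible because the subspace is small)."""
    return max(subspace, key=score_architecture)

def ase_nas_sketch(subspaces, steps=3):
    """Alternate global and local search; keep the best architecture."""
    best = None
    for _ in range(steps):
        space = evoke_subspace(subspaces)
        arch = local_search(space)
        if best is None or score_architecture(arch) > score_architecture(best):
            best = arch
    return best
```

For example, given two candidate subspaces, the sketch evokes the stronger one and returns its best architecture: `ase_nas_sketch([[(1, 1), (2, 2)], [(5, 5), (6, 6)]])` yields `(6, 6)` under the toy score.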


Results from the Paper

| Task                         | Dataset                        | Model    | Metric Name    | Metric Value | Global Rank |
|------------------------------|--------------------------------|----------|----------------|--------------|-------------|
| Neural Architecture Search   | NAS-Bench-201, CIFAR-100       | ASE-NAS+ | Accuracy (Val) | 73.12        | #10         |
| Neural Architecture Search   | NAS-Bench-201, ImageNet-16-120 | ASE-NAS+ | Accuracy (Val) | 46.66        | #2          |
