
NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search

Neural architecture search (NAS) is a promising method for automatically designing neural architectures. NAS adopts a search strategy to explore a predefined search space and find architectures with outstanding performance at minimal search cost. Bayesian optimization and evolutionary algorithms are two commonly used search strategies, but they suffer from high computational cost, implementation difficulty, or inefficient exploration. In this paper, we propose a neural predictor guided evolutionary algorithm to enhance the exploration ability of EA for NAS (NPENAS) and design two kinds of neural predictors. The first predictor is derived from Bayesian optimization, and we propose a graph-based uncertainty estimation network as a surrogate model that is easy to implement and computationally efficient. The second predictor is a graph-based neural network that directly outputs a performance prediction for the input neural architecture. The NPENAS variants using these two neural predictors are denoted NPENAS-BO and NPENAS-NP, respectively. In addition, we introduce a new random architecture sampling method to overcome the drawbacks of the existing sampling method. Extensive experiments demonstrate the superiority of NPENAS. Quantitative results on three NAS search spaces indicate that both NPENAS-BO and NPENAS-NP outperform most existing NAS algorithms, with NPENAS-BO achieving state-of-the-art performance on NASBench-201 and NPENAS-NP on NASBench-101 and DARTS.
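The sketch below illustrates the general idea of a predictor-guided evolutionary search loop as described in the abstract; it is a minimal illustration, not the authors' implementation. The helpers `random_arch`, `mutate`, `train_and_evaluate`, and `Predictor` are hypothetical placeholders standing in for the search-space sampler, the mutation operator, the expensive training step, and the graph-based neural predictor.

```python
# Minimal sketch of a predictor-guided evolutionary loop in the spirit of
# NPENAS-NP (an illustration, not the paper's code). All helper callables
# passed in are hypothetical placeholders.
import random

POPULATION = 20    # architectures kept in the evolutionary population
CANDIDATES = 100   # mutated candidates scored by the predictor per round
TOP_K = 10         # candidates actually trained each round
ROUNDS = 15

def predictor_guided_evolution(random_arch, mutate, train_and_evaluate, Predictor):
    # Seed the search with randomly sampled, fully evaluated architectures.
    history = [(a, train_and_evaluate(a))
               for a in (random_arch() for _ in range(POPULATION))]

    for _ in range(ROUNDS):
        # Fit the performance predictor on every (architecture, error) pair seen so far.
        predictor = Predictor()
        predictor.fit([a for a, _ in history], [err for _, err in history])

        # Evolution step: mutate the current best architectures to propose candidates.
        parents = sorted(history, key=lambda p: p[1])[:POPULATION]
        candidates = [mutate(random.choice(parents)[0]) for _ in range(CANDIDATES)]

        # The predictor guides exploration: only the top-k candidates by
        # predicted error are trained, which is where search cost is saved.
        ranked = sorted(candidates, key=predictor.predict)
        history += [(a, train_and_evaluate(a)) for a in ranked[:TOP_K]]

    # Return the best (architecture, validation error) pair found.
    return min(history, key=lambda p: p[1])
```

In this view, NPENAS-NP uses a predictor that directly regresses performance, while NPENAS-BO would instead use the graph-based uncertainty estimation network and rank candidates by an acquisition function.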
