A First Step Towards Runtime Analysis of Evolutionary Neural Architecture Search

22 Jan 2024 · Zeqiong Lv, Chao Qian, Yanan Sun

Evolutionary neural architecture search (ENAS) employs evolutionary algorithms to automatically find high-performing neural architectures, and has achieved great empirical success. In contrast to this success, however, its rigorous theoretical analysis remains untouched. This work takes a preliminary step toward the mathematical runtime analysis of ENAS. In particular, we define a binary classification problem $\textsc{UNIFORM}$ and formulate an explicit fitness function that captures the relationship between neural architecture and classification accuracy. Furthermore, we consider a (1+1)-ENAS algorithm with mutation to optimize the architecture, and obtain the following runtime bounds: both local and global mutation find the optimum in an expected runtime of $\Theta(n)$, where $n$ is the problem size. These theoretical results show that local and global mutation achieve nearly the same performance on $\textsc{UNIFORM}$, and empirical results verify the equivalence of the two mutation operators.
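To make the (1+1) scheme described in the abstract concrete, below is a minimal sketch in Python. It uses a bit-string encoding and a OneMax-style surrogate fitness as stand-ins: the paper's actual $\textsc{UNIFORM}$ fitness function and architecture mutations are not reproduced here, and all names (`fitness`, `local_mutation`, `global_mutation`, `one_plus_one_enas`) are illustrative, not the authors'.

```python
import random

def fitness(x):
    # Surrogate fitness (OneMax-style): counts "correct" components.
    # The paper's UNIFORM fitness maps an architecture to classification
    # accuracy; its exact form is not reproduced in this sketch.
    return sum(x)

def local_mutation(x):
    # Local mutation: flip exactly one uniformly chosen component.
    y = x[:]
    i = random.randrange(len(y))
    y[i] = 1 - y[i]
    return y

def global_mutation(x):
    # Global (standard-bit) mutation: flip each component
    # independently with probability 1/n.
    n = len(x)
    return [1 - b if random.random() < 1.0 / n else b for b in x]

def one_plus_one_enas(n, mutate, max_steps=10**6):
    # (1+1) scheme: keep a single parent, mutate it, and accept the
    # offspring whenever its fitness is not worse. Returns the number
    # of steps until the optimum (all ones) is reached.
    x = [random.randint(0, 1) for _ in range(n)]
    steps = 0
    while fitness(x) < n and steps < max_steps:
        y = mutate(x)
        if fitness(y) >= fitness(x):
            x = y
        steps += 1
    return steps

if __name__ == "__main__":
    n, trials = 100, 20
    for name, op in [("local", local_mutation), ("global", global_mutation)]:
        avg = sum(one_plus_one_enas(n, op) for _ in range(trials)) / trials
        print(f"{name} mutation: avg steps to optimum ~ {avg:.0f}")
```

On this surrogate both operators need an expected runtime linear in $n$ up to constant factors, which mirrors the $\Theta(n)$ bound and the near-equivalence of local and global mutation reported in the abstract.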
