Improving Neural Architecture Search by Mixing a FireFly algorithm with a Training Free Evaluation

Neural Architecture Search (NAS) algorithms automate the design of deep neural networks. Finding the best architecture for a given dataset can be time-consuming, since these algorithms must explore a large number of networks and score each one by its performance to choose the most appropriate architecture. In this work, we propose a novel metric that uses the Intra-Cluster Distance (ICD) score to evaluate how well an untrained model distinguishes between data points, and thereby approximate its quality. We also use an improved version of the FireFly algorithm, more robust to local optima than the baseline FireFly algorithm, as a search technique to find the neural network model best adapted to a specific dataset. Experimental results on several NAS benchmarks show that our metric is valid for scoring both CNNs and RNNs, and that our proposed FireFly algorithm improves on the results obtained by state-of-the-art training-free methods.
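The abstract does not spell out how the ICD score is computed. A minimal sketch, assuming the score is the mean distance of an untrained network's feature embeddings to their per-class centroids (the function name and details below are illustrative, not the paper's exact formulation):

```python
import numpy as np

def icd_score(features, labels):
    """Mean Intra-Cluster Distance: the average distance of each
    sample's feature vector to the centroid of its class.

    Under the training-free intuition from the abstract, a lower ICD
    on an untrained network's embeddings suggests the architecture
    already separates the classes well (a proxy for its quality)."""
    score = 0.0
    classes = np.unique(labels)
    for c in classes:
        cluster = features[labels == c]          # embeddings of class c
        centroid = cluster.mean(axis=0)          # class centroid
        score += np.linalg.norm(cluster - centroid, axis=1).mean()
    return score / len(classes)
```

In practice `features` would be the activations produced by an untrained candidate network on a mini-batch, so the score can rank architectures without any training.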

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Neural Architecture Search | NAS-Bench-101 | FireFly | Accuracy (%) | 93% | # 4 |
| Neural Architecture Search | NAS-Bench-101 | Improved FireFly Algorithm | Accuracy (%) | 94.03% | # 3 |
| Neural Architecture Search | NAS-Bench-201 | Improved FireFly Algorithm | Accuracy (%) | 93.58% | # 1 |
| Neural Architecture Search | NAS-Bench-201 | FireFly | Accuracy (%) | 92.90% | # 2 |
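The baseline FireFly algorithm that the paper improves on is the standard continuous firefly optimiser: each firefly moves toward every brighter one with attractiveness decaying as `beta0 * exp(-gamma * r^2)`, plus a small random step. A minimal sketch for maximisation (parameter names and defaults are illustrative; the paper's robustness improvements against local optima are not reproduced here):

```python
import math
import random

def firefly_search(objective, dim, n_fireflies=15, iters=50,
                   beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Baseline firefly algorithm over a continuous search space.

    In a NAS setting, `objective` would map an encoded architecture
    to a training-free score such as the ICD-based metric."""
    rng = random.Random(seed)
    # Initialise fireflies at random positions in [-1, 1]^dim.
    pop = [[rng.uniform(-1, 1) for _ in range(dim)]
           for _ in range(n_fireflies)]
    light = [objective(x) for x in pop]          # brightness = objective
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:          # j is brighter: attract i
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a)
                              + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    light[i] = objective(pop[i])
    best = max(range(n_fireflies), key=lambda k: light[k])
    return pop[best], light[best]
```

For example, `firefly_search(lambda x: -sum(v * v for v in x), dim=2)` drives the swarm toward the origin, where the toy objective is maximal.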
