Fair DARTS: Eliminating Unfair Advantages in Differentiable Architecture Search

ECCV 2020  ·  Xiangxiang Chu, Tianbao Zhou, Bo Zhang, Jixiang Li

Differentiable Architecture Search (DARTS) has become a widely used weight-sharing neural architecture search method. However, it suffers from a well-known performance collapse caused by the inevitable aggregation of skip connections. In this paper, we first disclose that the root cause lies in an unfair advantage within an exclusive competition. Through experiments, we show that if either of these two conditions, the unfair advantage or the exclusive competition, is removed, the collapse disappears. We therefore present a novel approach called Fair DARTS, in which the exclusive competition is relaxed to be collaborative. Specifically, we let each operation's architectural weight be independent of the others. However, an important issue of discretization discrepancy remains. We then propose a zero-one loss that pushes architectural weights toward zero or one, approximating the expected multi-hot solution. Our experiments are performed on two mainstream search spaces, and we derive new state-of-the-art results on CIFAR-10 and ImageNet. Our code is available at https://github.com/xiaomi-automl/fairdarts .
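The abstract's core idea can be sketched briefly: each operation's architectural weight passes through an independent sigmoid (rather than a softmax over all operations), and an auxiliary zero-one loss penalizes weights that linger near 0.5, pushing them toward 0 or 1 so that discretization after search changes the network as little as possible. Below is a minimal, illustrative sketch in plain Python; the function name and the use of a simple mean are assumptions for illustration, not the authors' exact implementation.

```python
import math

def sigmoid(a):
    """Standard logistic function, applied independently per operation."""
    return 1.0 / (1.0 + math.exp(-a))

def zero_one_loss(alphas):
    """Hypothetical sketch of a zero-one loss over architectural weights.

    For sigmoid-activated weights s_i = sigmoid(alpha_i), the term
    -(s_i - 0.5)^2 is largest (zero) when s_i = 0.5 and decreases as
    s_i approaches 0 or 1, so minimizing this loss drives each weight
    toward a near-binary (multi-hot) solution.
    """
    s = [sigmoid(a) for a in alphas]
    return -sum((si - 0.5) ** 2 for si in s) / len(s)
```

In a search loop this term would simply be added, with some weighting coefficient, to the usual validation loss; the closer the weights sit to 0 or 1 at the end of search, the smaller the discretization discrepancy.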

| Task                         | Dataset  | Model       | Metric                 | Value | Global Rank |
|------------------------------|----------|-------------|------------------------|-------|-------------|
| Neural Architecture Search   | CIFAR-10 | FairDARTS-a | Top-1 Error Rate       | 2.54% | #24         |
|                              |          |             | Search Time (GPU days) | 0.25  | #8          |
|                              |          |             | Parameters             | 2.8M  | #20         |
|                              |          |             | FLOPS                  | 746M  | #40         |
| Neural Architecture Search   | ImageNet | FairDARTS-C | Top-1 Error Rate       | 22.8  | #68         |
|                              |          |             | MACs                   | 386M  | #109        |
