Evolving Transferable Neural Pruning Functions

21 Oct 2021 · Yuchen Liu, S. Y. Kung, David Wentzlaff

The structural design of neural networks is crucial to the success of deep learning. While most prior work in evolutionary learning aims at directly searching the structure of a network, few attempts have been made on another promising track, channel pruning, which has recently made major headway in designing efficient deep learning models. Existing pruning methods rely on human-crafted pruning functions to score a channel's importance, which requires domain knowledge and can be sub-optimal. To this end, we pioneer the use of genetic programming (GP) to discover strong pruning metrics automatically. Specifically, we craft a novel design space for expressing high-quality, transferable pruning functions, ensuring an end-to-end evolution process in which the evolved functions need no manual modification to transfer after evolution. Unlike prior methods, our approach provides both compact pruned networks for efficient inference and novel closed-form pruning metrics that are mathematically explainable and thus generalizable to different pruning tasks. Although the evolution is conducted on small datasets, our functions show promising results when applied to more challenging datasets different from those used during evolution. For example, on ILSVRC-2012, an evolved function achieves state-of-the-art pruning results.
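To make the idea concrete, below is a minimal sketch of evolving a channel-scoring function with genetic programming. It is not the paper's method: the operator set, the per-channel statistics (`l1`, `var`), and the toy oracle-agreement fitness are all illustrative assumptions standing in for the paper's design space and its small-dataset accuracy evaluation.

```python
# Sketch: GP search over expression trees that score channels for pruning.
# Assumptions (not from the paper): operator set, channel statistics,
# and an oracle-agreement fitness proxy.
import random
import numpy as np

def channel_stats(weights):
    # weights: (out_channels, in_channels, kH, kW); assumed statistics.
    flat = weights.reshape(weights.shape[0], -1)
    return {"l1": np.abs(flat).sum(axis=1), "var": flat.var(axis=1)}

# Candidate pruning functions are expression trees over the statistics.
OPS = {
    "add": (2, lambda a, b: a + b),
    "mul": (2, lambda a, b: a * b),
    "neg": (1, lambda a: -a),
    "log": (1, lambda a: np.log(np.abs(a) + 1e-8)),
}
TERMINALS = ["l1", "var"]

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    arity = OPS[op][0]
    return (op, *[random_tree(depth - 1) for _ in range(arity)])

def evaluate(tree, stats):
    if isinstance(tree, str):
        return stats[tree]
    op, *args = tree
    return OPS[op][1](*(evaluate(a, stats) for a in args))

def mutate(tree, depth=2):
    # Replace a random subtree with a fresh one.
    if random.random() < 0.2 or isinstance(tree, str):
        return random_tree(depth)
    op, *args = tree
    i = random.randrange(len(args))
    args[i] = mutate(args[i], depth)
    return (op, *args)

def fitness(tree, weights, oracle_scores):
    # Fitness proxy: prune the lowest-scoring half of the channels and
    # reward agreement with an oracle ranking (a stand-in for evaluating
    # pruned-network accuracy on a small dataset during evolution).
    scores = evaluate(tree, channel_stats(weights))
    keep = np.argsort(scores)[len(scores) // 2:]
    oracle_keep = np.argsort(oracle_scores)[len(oracle_scores) // 2:]
    return len(set(keep) & set(oracle_keep)) / len(keep)

rng = np.random.default_rng(0)
weights = rng.normal(size=(32, 16, 3, 3))          # one toy conv layer
oracle = np.abs(weights.reshape(32, -1)).sum(axis=1)  # toy oracle ranking

population = [random_tree() for _ in range(20)]
for gen in range(10):
    population.sort(key=lambda t: -fitness(t, weights, oracle))
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

population.sort(key=lambda t: -fitness(t, weights, oracle))
best = population[0]
print("best function:", best, "fitness:", fitness(best, weights, oracle))
```

Because the evolved artifact is a closed-form expression over channel statistics rather than a learned black box, it can be read off the tree and re-applied to layers, architectures, and datasets beyond those seen during evolution, which is the transferability property the abstract emphasizes.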
