A General Computational Framework to Measure the Expressiveness of Complex Networks Using a Tighter Upper Bound of Linear Regions

The expressiveness of a deep neural network (DNN) offers one perspective for understanding its surprising performance. Expressiveness is commonly measured by the number of linear regions, i.e., the number of pieces into which the piecewise-linear function represented by a DNN partitions its input space. For a rectifier network, an upper bound on this number, rather than the number itself, is the more practical measure of expressiveness. In this work, we propose a new, tighter upper bound on the number of linear regions. Inspired by the proof of this bound and by the matrix-computation framework of Hinz & Van de Geer (2019), we develop a general computational approach that yields a tight upper bound for, in principle, arbitrary network architectures (e.g., DNNs with any kind of skip connection or residual structure). Our experiments show that our upper bound is tighter than existing ones, and they help explain why skip connections and residual structures can improve network performance.
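To make the counting notion concrete, the sketch below (not the paper's framework; a toy illustration with an assumed two-hidden-layer random network and grid sampling) estimates the number of linear regions empirically: each distinct ReLU on/off activation pattern over the input corresponds to a potential linear region, so the count of patterns seen on a dense grid is a lower bound on the true number. For contrast, it also prints the classical single-layer upper bound from hyperplane-arrangement theory, sum over j ≤ d of C(n, j).

```python
# Minimal sketch: empirically count distinct ReLU activation patterns of a
# small rectifier network over a 2-D input grid. Each distinct pattern maps
# to one linear piece, so len(patterns) lower-bounds the number of regions.
# The network sizes and grid are arbitrary choices for illustration.
from math import comb
import numpy as np

rng = np.random.default_rng(0)

# Toy rectifier network: 2 inputs -> 8 hidden units -> 8 hidden units.
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)

def activation_pattern(x):
    """Return the ReLU on/off pattern for input x as a tuple of bits."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0.0) + b2
    return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

# Sample a dense grid over [-1, 1]^2 and collect distinct patterns.
grid = np.linspace(-1.0, 1.0, 400)
patterns = {activation_pattern(np.array([x, y])) for x in grid for y in grid}
print("Distinct activation patterns (lower bound on regions):", len(patterns))

# Classical upper bound for a single layer of n ReLUs in d input dimensions:
# n hyperplanes partition R^d into at most sum_{j=0}^{d} C(n, j) regions.
n, d = 8, 2
print("One-layer upper bound:", sum(comb(n, j) for j in range(d + 1)))
```

The gap between the empirical count and such bounds is exactly what motivates tighter upper bounds like the one proposed in the paper, especially for architectures with skip connections where layer-wise bounds compose poorly.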
