Understanding, Analyzing, and Optimizing the Complexity of Deep Models
This paper aims to evaluate and analyze the complexity of the feature transformations encoded in DNNs. We propose metrics, grounded in information theory, that measure three types of transformation complexity. We further discover and prove a negative correlation between the complexity and the disentanglement of transformations. Based on the proposed metrics, we analyze two typical phenomena in how transformation complexity changes during training, and explore the ceiling of a DNN's complexity. The proposed metrics can also be used as a loss to learn a DNN with minimum complexity, which in turn limits the extent of over-fitting. Comprehensive comparative studies provide new perspectives for understanding DNNs. We will release the code when the paper is accepted.
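The abstract does not detail the exact information-theoretic metrics. As a minimal illustrative sketch, one common proxy for the complexity of a learned transformation is the differential entropy of its intermediate activations; the snippet below estimates it under a Gaussian assumption (the function name and the Gaussian approximation are assumptions for illustration, not the paper's actual method):

```python
import numpy as np

def gaussian_entropy(activations):
    """Estimate the differential entropy of a layer's activations,
    assuming each unit is approximately Gaussian (an illustrative proxy,
    not the paper's exact metric). The entropy of N(mu, var) is
    0.5 * log(2 * pi * e * var) per unit; we sum over units."""
    var = np.var(activations, axis=0) + 1e-8  # per-unit variance, stabilized
    return float(np.sum(0.5 * np.log(2.0 * np.pi * np.e * var)))

# Toy check: a lower-variance ("simpler") transformation yields lower entropy,
# so such a term could serve as a complexity penalty in a training loss.
rng = np.random.default_rng(0)
simple_feats = rng.normal(0.0, 0.1, size=(1000, 16))   # low-variance features
complex_feats = rng.normal(0.0, 1.0, size=(1000, 16))  # high-variance features
assert gaussian_entropy(simple_feats) < gaussian_entropy(complex_feats)
```

Used as a regularizer, minimizing such an entropy estimate over a layer's activations would push the network toward lower-complexity transformations, in the spirit of the minimum-complexity loss the abstract describes.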