Information distance for neural network functions
We provide a practical distance measure on the space of functions parameterized by neural networks. It is based on the classical information distance, with the uncomputable Kolmogorov complexity in the original definition replaced by information measured as the codelength of prequential coding. Empirically, we show that this information distance is invariant to different neural-network parameterizations of the same function. We also verify that information distance faithfully reflects similarities between neural network functions. Finally, we apply information distance to investigate relationships between neural network models, and demonstrate its connection to multiple characteristics and behaviors of such models.
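The core idea can be sketched concretely: replace each Kolmogorov-complexity term in the classical (normalized) information distance with a prequential codelength, i.e. the cumulative cost of encoding each observation under a predictor trained only on earlier observations. The sketch below is illustrative, not the paper's implementation: it uses a simple online Gaussian predictor in place of a neural network, and approximates the conditional codelength L(f|g) by encoding the residual between the two functions' outputs; all names and these modeling choices are assumptions.

```python
import numpy as np

def prequential_codelength(targets, sigma=1.0):
    """Prequential codelength (in nats) of a 1-D target sequence under an
    online Gaussian predictor whose mean is the running average of past
    targets. Each sample is paid for before the predictor is updated."""
    total, mean, n = 0.0, 0.0, 0
    for y in targets:
        # Cost of encoding y under the current predictive distribution
        total += 0.5 * np.log(2 * np.pi * sigma**2) + (y - mean) ** 2 / (2 * sigma**2)
        # Update the predictor only after paying for this sample
        n += 1
        mean += (y - mean) / n
    return total

def information_distance(f_out, g_out, sigma=1.0):
    """Normalized information distance with Kolmogorov complexity replaced
    by prequential codelength. L(f|g) is approximated (an assumption of this
    sketch) by the codelength of the residual f_out - g_out."""
    L_f = prequential_codelength(f_out, sigma)
    L_g = prequential_codelength(g_out, sigma)
    L_f_given_g = prequential_codelength(f_out - g_out, sigma)
    L_g_given_f = prequential_codelength(g_out - f_out, sigma)
    return max(L_f_given_g, L_g_given_f) / max(L_f, L_g)

# Evaluate candidate functions on shared probe inputs
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
f_out = np.sin(3 * x)
g_out = np.sin(3 * x) + 0.05 * rng.normal(size=500)  # near-identical function
h_out = np.cos(5 * x)                                # unrelated function

# Similar functions should come out closer than dissimilar ones
print(information_distance(f_out, g_out))
print(information_distance(f_out, h_out))
```

The normalization by max(L_f, L_g) mirrors the classical normalized information distance, so the measure compares functions of different intrinsic complexity on a common scale.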