Information distance for neural network functions

1 Jan 2021 · Xiao Zhang, Dejing Dou, Ji Wu

We provide a practical distance measure on the space of functions parameterized by neural networks. It is based on the classic information distance, with the uncomputable Kolmogorov complexity in the original definition replaced by information measured as the codelength of prequential coding. Empirically, we show that this information distance is invariant under different neural network parameterizations of the same function. We also verify that information distance faithfully reflects similarities between neural network functions. Finally, we apply information distance to investigate relationships between neural network models, and we demonstrate its connection to multiple characteristics and behaviors of these models.
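
For readers unfamiliar with the quantities involved, a minimal sketch in LaTeX follows. The first display is the classic information distance of Bennett et al.; the second is the prequential codelength that the abstract describes as a substitute for the uncomputable Kolmogorov terms; the third is an illustrative form of the resulting surrogate distance. The notation (L_preq, the model p with incrementally fitted parameters, the conditional coding of one function's outputs given another's, and the absence of normalization) is assumed here for exposition and may differ from the paper's exact definitions.

% Classic information distance (Bennett et al.): the harder of the two
% conditional Kolmogorov complexities between objects f and g.
\[
  E(f, g) = \max\{\, K(f \mid g),\ K(g \mid f) \,\}
\]
% Prequential codelength: encode each label with a model trained only on
% the preceding examples, then sum the negative log-likelihoods.
\[
  L_{\mathrm{preq}}(y_{1:n} \mid x_{1:n})
    = \sum_{i=1}^{n} -\log p_{\hat\theta_{i-1}}(y_i \mid x_i)
\]
% Computable surrogate (illustrative form): replace K with L_preq, coding
% the outputs of one function given access to the other on the same inputs.
\[
  d(f, g) \approx \max\{\, L_{\mathrm{preq}}(f \mid g),\ L_{\mathrm{preq}}(g \mid f) \,\}
\]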
