18 Apr 2018 • Rana Ali Amjad, Kairen Liu, Bernhard C. Geiger
In this work, we investigate the use of three information-theoretic quantities -- entropy, mutual information with the class variable, and a class selectivity measure based on Kullback-Leibler divergence -- to analyze the behavior of trained fully-connected feed-forward neural networks.
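As a rough illustration of how these three quantities can be estimated for a single neuron, the sketch below quantizes the neuron's activations into bins and computes entropy, mutual information with the class label, and a per-class KL-based selectivity from the empirical distributions. This is a minimal sketch under common estimation assumptions (uniform binning, plug-in probabilities); the function names and binning scheme are illustrative and not taken from the paper.

```python
import numpy as np

def discrete_entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def neuron_statistics(activations, labels, n_bins=10):
    """Estimate H(T), I(T;Y), and per-class KL selectivity
    for one neuron's activations T and class labels Y,
    using uniform quantization and empirical frequencies."""
    edges = np.histogram_bin_edges(activations, bins=n_bins)
    # Bin index per sample (interior edges only, so indices fall in 0..n_bins-1)
    t = np.digitize(activations, edges[1:-1])
    n = len(activations)

    # Marginal distribution P(T) and its entropy
    p_t = np.bincount(t, minlength=n_bins) / n
    h_t = discrete_entropy(p_t)

    # Conditional entropy H(T|Y) and KL(P(T|Y=y) || P(T)) per class
    h_t_given_y = 0.0
    selectivity = {}
    for y in np.unique(labels):
        mask = labels == y
        p_t_y = np.bincount(t[mask], minlength=n_bins) / mask.sum()
        h_t_given_y += mask.mean() * discrete_entropy(p_t_y)
        nz = p_t_y > 0
        selectivity[y] = np.sum(p_t_y[nz] * np.log2(p_t_y[nz] / p_t[nz]))

    mutual_info = h_t - h_t_given_y  # I(T;Y) = H(T) - H(T|Y)
    return h_t, mutual_info, selectivity
```

For a neuron whose activation perfectly separates two equally likely classes, this estimator returns H(T) = 1 bit and I(T;Y) = 1 bit, as expected.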