Rethinking Softmax with Cross-Entropy: Neural Network Classifier as Mutual Information Estimator

25 Nov 2019 · Zhenyue Qin, Dongwoo Kim, Tom Gedeon

Mutual information is widely applied to learn latent representations of observations, yet its role in classification neural networks remains to be better explained. We show that optimising the parameters of a classification neural network with softmax cross-entropy is equivalent to maximising the mutual information between inputs and labels under the balanced-data assumption. Through experiments on synthetic and real datasets, we show that softmax cross-entropy can approximately estimate mutual information. When applied to image classification, this relation makes it possible to approximate the point-wise mutual information between an input image and a label without modifying the network structure. To this end, we propose infoCAM, an informative class activation map, which highlights the regions of an input image that are most relevant to a given label based on differences in information. The activation map helps localise the target object within the input image. Through experiments on the semi-supervised object localisation task with two real-world datasets, we evaluate the effectiveness of our information-theoretic approach.
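To make the abstract's central claim concrete, the sketch below (PyTorch; all function and tensor names are hypothetical, not from the paper's code) illustrates the two ideas under stated assumptions. First, with K balanced classes, p(y) = 1/K, so the point-wise mutual information can be approximated from a trained classifier's output as PMI(x, y) ≈ log p(y|x) + log K. Second, an infoCAM-style map can be formed by contrasting the class activation map of the target label with the average map of the remaining labels; this is a minimal simplification of the paper's information-difference formulation, not a definitive implementation.

```python
import torch
import torch.nn.functional as F


def pmi_from_logits(logits: torch.Tensor, label: int) -> torch.Tensor:
    """Approximate PMI(x, y) from classifier logits, assuming balanced
    classes: PMI(x, y) = log p(y|x) - log p(y) ≈ log p(y|x) + log K,
    since p(y) = 1/K under the balanced-data assumption."""
    num_classes = logits.shape[-1]
    log_p_y_given_x = F.log_softmax(logits, dim=-1)[..., label]
    return log_p_y_given_x + torch.log(torch.tensor(float(num_classes)))


def info_cam(feature_maps: torch.Tensor, fc_weights: torch.Tensor,
             label: int) -> torch.Tensor:
    """A simplified infoCAM-style map (an assumption, not the paper's exact
    formula): the target label's class activation map minus the mean map of
    all other labels, so high values mark regions contributing most to the
    information difference for that label.

    feature_maps: (C, H, W) convolutional features before global pooling.
    fc_weights:   (K, C) weights of the final linear classification layer.
    """
    # One CAM per class: weighted sum of feature channels.
    cams = torch.einsum('kc,chw->khw', fc_weights, feature_maps)
    target = cams[label]
    others = (cams.sum(dim=0) - target) / (cams.shape[0] - 1)
    return target - others
```

Note that neither function requires changing the network: both operate on quantities (logits, feature maps, final-layer weights) that an ordinary softmax classifier already exposes, which matches the abstract's claim that the PMI approximation needs no architectural modification.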

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Weakly-Supervised Object Localization | CUB-200-2011 | InfoCAM | Top-1 Error Rate | 54.17 | #4 |
| Weakly-Supervised Object Localization | CUB-200-2011 | InfoCAM | Top-1 Localization Accuracy | 55.83 | #6 |
| Fine-Grained Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Average Per-Class Accuracy | 87.69 | #1 |
| Fine-Grained Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Accuracy | 89.73 | #1 |
| Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Average Per-Class Accuracy | 87.69 | #1 |
| Image Classification | Imbalanced CUB-200-2011 | PC-Softmax | Accuracy | 89.73 | #2 |
| Weakly-Supervised Object Localization | Tiny ImageNet | InfoCAM | Top-1 Localization Accuracy | 43.34 | #1 |
