Mutual Information Continuity-constrained Estimator

29 Sep 2021  ·  Tsun-An Hsieh, Cheng Yu, Ying Hung, Chung-Ching Lin, Yu Tsao

The estimation of mutual information (MI) is vital to a variety of applications in machine learning. Recent neural approaches have shown encouraging potential for estimating the MI between high-dimensional variables based on their latent representations. However, these estimators are prone to high variance owing to inevitable outlier events. Recent approaches mitigate the outlier issue by smoothing the partition function with clipping or averaging strategies; however, these estimators either break the lower-bound condition or sacrifice accuracy. Accordingly, we propose the Mutual Information Continuity-constrained Estimator (MICE). MICE instead smooths the partition function by constraining the Lipschitz constant of the log-density-ratio estimator, thus alleviating the induced variance without clipping or averaging. The proposed estimator outperforms most existing estimators in terms of bias and variance on the standard benchmark. In addition, we propose an extension of the standard benchmark in which variables are drawn from a multivariate normal distribution with correlations between the samples in a batch. The experimental results imply that when the i.i.d. assumption does not hold, our estimator can be more accurate than existing approaches, which tend to underestimate the MI. Finally, we demonstrate that MICE mitigates mode collapse in a kernel density estimation task.
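The paper's exact architecture is not given on this page, so the following is only a minimal sketch of the idea: estimate a Donsker-Varadhan lower bound on MI with a neural critic T(x, y), and bound the critic's Lipschitz constant, here via spectral normalization on each linear layer, so that exp(T) cannot blow up on outlier negative samples and the log-partition term stays stable. The names LipschitzCritic and dv_lower_bound are illustrative, not the authors'.

```python
import math
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

class LipschitzCritic(nn.Module):
    """MLP critic T(x, y). Spectral norm caps each layer's spectral norm
    at 1, bounding the Lipschitz constant of the whole network (one
    standard choice; the paper may constrain it differently)."""
    def __init__(self, dim, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            spectral_norm(nn.Linear(2 * dim, hidden)), nn.ReLU(),
            spectral_norm(nn.Linear(hidden, hidden)), nn.ReLU(),
            spectral_norm(nn.Linear(hidden, 1)),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

def dv_lower_bound(critic, x, y):
    """Donsker-Varadhan bound: E_p(x,y)[T] - log E_p(x)p(y)[exp(T)].
    Marginal samples come from shuffling y within the batch; the second
    term is the log-partition function whose variance MICE targets."""
    joint = critic(x, y)
    marginal = critic(x, y[torch.randperm(y.size(0))])
    log_partition = torch.logsumexp(marginal, dim=0) - math.log(y.size(0))
    return joint.mean() - log_partition
```

Maximizing this bound over the critic's parameters yields the MI estimate; because the constraint acts on the critic itself rather than on the partition-function estimate, the lower-bound property is preserved without clipping or averaging.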
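The abstract describes the benchmark extension only at a high level, so the sketch below makes one concrete assumption: an AR(1) correlation across the batch dimension, controlled by a hypothetical parameter sample_corr. Each row remains marginally N(0, I) and each pair (x_i, y_i) keeps componentwise correlation rho, so the true per-pair MI is still -dim/2 · log(1 - rho^2) while the samples in a batch are no longer independent.

```python
import math
import torch

def correlated_batch(batch, dim, rho, sample_corr=0.5):
    """Draw one batch for the Gaussian MI benchmark in which the samples
    themselves are correlated (AR(1) across the batch index), breaking
    the i.i.d. assumption without changing the true per-pair MI."""
    idx = torch.arange(batch)
    # Batch-level AR(1) correlation matrix: C[i, j] = sample_corr ** |i - j|
    C = sample_corr ** (idx[:, None] - idx[None, :]).abs().float()
    L = torch.linalg.cholesky(C)
    x = L @ torch.randn(batch, dim)    # rows correlated, each row ~ N(0, I)
    eps = L @ torch.randn(batch, dim)
    y = rho * x + math.sqrt(1 - rho ** 2) * eps
    true_mi = -0.5 * dim * math.log(1 - rho ** 2)
    return x, y, true_mi
```

Under this construction, batch-based estimates of the log-partition function use effectively fewer independent negative samples, which is one plausible mechanism for the underestimation the abstract reports for existing estimators.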
