Mutual Information Estimation

38 papers with code • 0 benchmarks • 0 datasets

The task of estimating mutual information from samples, especially for high-dimensional variables.
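As a concrete baseline for the task, here is a minimal plug-in estimator (histogram binning; function name and bin count are our own choices, not from any listed paper). This is exactly the kind of estimator that degrades quickly as dimensionality grows, which motivates the papers below:

```python
import numpy as np

def mi_histogram(x, y, bins=16):
    """Plug-in mutual information estimate (in nats) from paired 1-D samples,
    via a 2-D histogram. A crude baseline that does not scale to high
    dimensions."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x + rng.normal(size=100_000)   # correlated pair, rho^2 = 0.5
# mi_histogram(x, y) should land near the true value
# -0.5 * ln(1 - rho^2) = 0.5 * ln 2 ~= 0.35 nats.
```

For an independent pair the same estimator returns a value near zero (up to a small positive bias of roughly (bins² − 1)/(2n) nats).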

Most implemented papers

LSMI-Sinkhorn: Semi-supervised Mutual Information Estimation with Optimal Transport

csyanbin/LSMI-Sinkhorn 5 Sep 2019

To estimate the mutual information from data, a common practice is preparing a set of paired samples $\{(\mathbf{x}_i,\mathbf{y}_i)\}_{i=1}^n \stackrel{\mathrm{i.i.d.}}{\sim} p(\mathbf{x},\mathbf{y})$.

Graph Representation Learning via Graphical Mutual Information Maximization

zpeng27/GMI 4 Feb 2020

The richness in the content of various information networks such as social networks and communication networks provides the unprecedented potential for learning high-quality expressive representations without external supervision.

On the Information Plane of Autoencoders

nicolasigor/entropy 15 May 2020

Recently, the Information Plane (IP), which is based on the information-theoretic concept of mutual information (MI), was proposed as a tool to analyze autoencoders.

copent: Estimating Copula Entropy and Transfer Entropy in R

majianthu/copent 27 May 2020

Copula Entropy is a mathematical concept defined by Ma and Sun for measuring and testing multivariate statistical independence; it has also been proved to be closely related to conditional independence (and hence to transfer entropy).
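The copula route to mutual information can be sketched in a few lines (copent itself uses a kNN entropy estimator; this histogram version is only illustrative, and the function name is ours). Ranks map each margin to approximately Uniform(0, 1); by Ma and Sun's identity, the negative differential entropy of the resulting joint, the copula entropy, equals the mutual information. Because only ranks enter, the estimate is invariant to monotone transforms of either variable:

```python
import numpy as np

def copula_mi(x, y, bins=16):
    """Illustrative copula-entropy MI estimate (nats) for paired 1-D samples."""
    u = x.argsort().argsort() / len(x)   # empirical CDF values of x
    v = y.argsort().argsort() / len(y)
    p, _, _ = np.histogram2d(u, v, bins=bins, range=[[0, 1], [0, 1]])
    p /= p.sum()
    nz = p > 0
    # cell density c = p * bins^2, and MI = -H(copula) = sum p * log(c)
    return float((p[nz] * np.log(p[nz] * bins * bins)).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=50_000)
y = x + rng.normal(size=50_000)
# copula_mi(np.exp(x), y**3) equals copula_mi(x, y) exactly:
# monotone transforms leave the ranks, and hence the estimate, unchanged.
```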

Telescoping Density-Ratio Estimation

benrhodes26/tre_code NeurIPS 2020

Density-ratio estimation via classification is a cornerstone of unsupervised learning.
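The classification trick itself is easy to demonstrate (this is a generic sketch, not the paper's telescoping scheme): a classifier trained to separate samples of $p$ from samples of $q$ recovers the density ratio $p(x)/q(x)$ through its odds, provided the two classes have equal priors:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
p_samples = rng.normal(loc=1.0, size=(5000, 1))   # p = N(1, 1)
q_samples = rng.normal(loc=0.0, size=(5000, 1))   # q = N(0, 1)

X = np.vstack([p_samples, q_samples])
labels = np.r_[np.ones(5000), np.zeros(5000)]
clf = LogisticRegression().fit(X, labels)

def log_ratio(x):
    """log p(x)/q(x), estimated as the classifier's log-odds."""
    pr = clf.predict_proba(np.asarray(x, dtype=float).reshape(-1, 1))
    return np.log(pr[:, 1] / pr[:, 0])

# For these two Gaussians the true log-ratio is x - 0.5 (linear in x),
# so a logistic-regression critic is well specified here.
```

Telescoping addresses the failure mode of this trick: when $p$ and $q$ are too easy to tell apart, the classifier saturates and the ratio estimate collapses, so the paper bridges them through a chain of intermediate distributions.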

Reducing the Variance of Variational Estimates of Mutual Information by Limiting the Critic's Hypothesis Space to RKHS

blackPython/mi_estimator 17 Nov 2020

We argue that the high variance characteristic is due to the uncontrolled complexity of the critic's hypothesis space.

Unsupervised Domain Adaptation of a Pretrained Cross-Lingual Language Model

lijuntaopku/UFD 23 Nov 2020

Experimental results show that our proposed method achieves significant performance improvements over the state-of-the-art pretrained cross-lingual language model in the CLCD setting.

Neural Joint Entropy Estimation

YuvalShalev/NJEE 21 Dec 2020

Estimating the entropy of a discrete random variable is a fundamental problem in information theory and related fields.
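The classical baseline for this problem is the plug-in estimator over empirical frequencies (a generic sketch, not NJEE's method; the function name is ours). It is biased low when the alphabet is large relative to the sample size, which is the regime neural estimators target:

```python
import numpy as np

def plugin_entropy(samples):
    """Plug-in (empirical-frequency) entropy estimate in nats for a
    discrete sample."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
die = rng.integers(0, 6, size=100_000)   # fair six-sided die
# plugin_entropy(die) is close to the true entropy ln 6 ~= 1.79 nats,
# since here the alphabet (6 symbols) is tiny relative to the sample.
```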

MIND: Inductive Mutual Information Estimation, A Convex Maximum-Entropy Copula Approach

kxytechnologies/kxy-python 25 Feb 2021

We propose a novel estimator of the mutual information between two ordinal vectors $x$ and $y$.