Mutual Information Estimation
38 papers with code • 0 benchmarks • 0 datasets
Estimating mutual information from samples, especially for high-dimensional variables.
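To make the task concrete, here is the simplest baseline: a plug-in estimate from a 2-D histogram. This is a minimal sketch (function name and bin count are illustrative choices, not from any paper below); it works for scalar variables but its bias grows with bin count and it breaks down in high dimensions, which is the failure mode the estimators listed below address.

```python
import numpy as np

def mi_histogram(x, y, bins=16):
    """Plug-in MI estimate (in nats) from a 2-D histogram.
    Simple but biased for small samples; unusable in high
    dimensions, which motivates the variational and copula
    estimators covered on this page."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint cell probabilities
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x (column)
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y (row)
    nz = pxy > 0                              # skip empty cells, avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
mi_xy = mi_histogram(x, x + 0.5 * rng.normal(size=5000))  # dependent pair
mi_xz = mi_histogram(x, rng.normal(size=5000))            # independent pair
```

For the dependent pair the estimate is well above zero, while for the independent pair it is close to zero (the small positive residue is the estimator's finite-sample bias).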
Benchmarks
These leaderboards are used to track progress in Mutual Information Estimation.
Most implemented papers
LSMI-Sinkhorn: Semi-supervised Mutual Information Estimation with Optimal Transport
To estimate the mutual information from data, a common practice is preparing a set of paired samples $\{(\mathbf{x}_i,\mathbf{y}_i)\}_{i=1}^n \stackrel{\mathrm{i.i.d.}}{\sim} p(\mathbf{x},\mathbf{y})$.
Learning Disentangled Representations via Mutual Information Estimation
In this paper, we investigate the problem of learning disentangled representations.
Graph Representation Learning via Graphical Mutual Information Maximization
The richness in the content of various information networks such as social networks and communication networks provides the unprecedented potential for learning high-quality expressive representations without external supervision.
On the Information Plane of Autoencoders
Recently, the Information Plane (IP), which is based on the information-theoretic concept of mutual information (MI), was proposed as a tool for analyzing neural networks.
copent: Estimating Copula Entropy and Transfer Entropy in R
Copula Entropy is a mathematical concept defined by Ma and Sun for measuring and testing multivariate statistical independence, and it has also been proved to be closely related to conditional independence (or transfer entropy).
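The copula view can be sketched in a few lines: rank-transform each margin onto the empirical copula in $[0,1]^2$, then mutual information equals the negative copula entropy. The sketch below is an illustration of that identity only, not the copent package itself; the function name is hypothetical, and a crude histogram entropy stands in for the kNN entropy estimator the package actually uses.

```python
import numpy as np

def copula_mi(x, y, bins=16):
    """MI as negative copula entropy (sketch of the copent idea).
    Ranks map the margins to the empirical copula on [0,1]^2; a
    histogram entropy estimate stands in for copent's kNN one."""
    n = len(x)
    u = (np.argsort(np.argsort(x)) + 0.5) / n   # empirical CDF rank of x
    v = (np.argsort(np.argsort(y)) + 0.5) / n
    h, _, _ = np.histogram2d(u, v, bins=bins, range=[[0, 1], [0, 1]])
    p = h / n
    nz = p > 0
    # each copula cell has volume 1/bins**2, so the differential
    # entropy is H = -sum p*log(p*bins**2), and MI = -H
    return float(np.sum(p[nz] * np.log(p[nz] * bins * bins)))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
mi_dep = copula_mi(x, 0.8 * x + 0.6 * rng.normal(size=5000))
mi_ind = copula_mi(x, rng.normal(size=5000))
```

Because the rank transform discards the margins, the estimate depends only on the dependence structure, which is the point of the copula formulation.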
Telescoping Density-Ratio Estimation
Density-ratio estimation via classification is a cornerstone of unsupervised learning.
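That cornerstone can be illustrated with a single ratio (the paper's contribution is telescoping a product of such ratios; this sketch shows only the one-ratio baseline, with an assumed quadratic feature map that is adequate for jointly Gaussian data and a hypothetical function name): train a logistic classifier to separate joint pairs from shuffled pairs, then average its logit over the joint samples.

```python
import numpy as np

def mi_classifier(x, y, steps=2000, lr=0.1, seed=1):
    """Single-ratio MI sketch: logistic classification of joint vs.
    product-of-marginals samples. With balanced classes the fitted
    logit approximates log p(x,y)/(p(x)p(y)), so its mean over the
    joint pairs is a plug-in estimate of the mutual information."""
    rng = np.random.default_rng(seed)
    y_shuf = rng.permutation(y)              # product-of-marginals sample
    feats = lambda a, b: np.column_stack(
        [a, b, a * b, a * a, b * b, np.ones_like(a)])
    X = np.vstack([feats(x, y), feats(x, y_shuf)])
    t = np.concatenate([np.ones(len(x)), np.zeros(len(x))])  # 1 = joint
    w = np.zeros(X.shape[1])
    for _ in range(steps):                   # plain gradient ascent on log-lik
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (t - p) / len(t)
    return float(np.mean(feats(x, y) @ w))   # mean logit on joint pairs

rng = np.random.default_rng(0)
x = rng.normal(size=4000)
y = 0.8 * x + 0.6 * rng.normal(size=4000)  # rho = 0.8, true MI ~ 0.51 nats
mi_dep = mi_classifier(x, y)
mi_ind = mi_classifier(x, rng.normal(size=4000))
```

When the two distributions are far apart, a single classifier saturates and the ratio estimate collapses; chaining several intermediate ratios is the telescoping remedy the paper develops.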
Reducing the Variance of Variational Estimates of Mutual Information by Limiting the Critic's Hypothesis Space to RKHS
We argue that the high variance characteristic is due to the uncontrolled complexity of the critic's hypothesis space.
Unsupervised Domain Adaptation of a Pretrained Cross-Lingual Language Model
Experimental results show that our proposed method achieves significant performance improvements over the state-of-the-art pretrained cross-lingual language model in the CLCD setting.
Neural Joint Entropy Estimation
Estimating the entropy of a discrete random variable is a fundamental problem in information theory and related fields.
MIND: Inductive Mutual Information Estimation, A Convex Maximum-Entropy Copula Approach
We propose a novel estimator of the mutual information between two ordinal vectors $x$ and $y$.