Search Results for author: Daniel Chicharro

Found 6 papers, 2 papers with code

Quantifying multivariate redundancy with maximum entropy decompositions of mutual information

2 code implementations · 13 Aug 2017 · Daniel Chicharro

Williams and Beer (2010) proposed a nonnegative decomposition of mutual information, based on the construction of redundancy lattices, which separates the information that a set of variables carries about a target variable into nonnegative components: the unique information provided by some variables and not by others, together with redundant and synergistic components.
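The synergistic component that such decompositions isolate can be illustrated with the textbook XOR example. The sketch below is a hypothetical illustration, not the paper's estimator: for Y = X1 xor X2 with independent fair-coin sources, each source alone carries zero information about Y, yet the pair determines Y completely.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(A;B) in bits from a 2-D joint probability table."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])))

# Joint distribution p(x1, x2, y) for the XOR gate with uniform inputs.
p = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p[x1, x2, x1 ^ x2] = 0.25

p_x1y = p.sum(axis=1)       # marginal joint p(x1, y)
p_x2y = p.sum(axis=0)       # marginal joint p(x2, y)
p_x12y = p.reshape(4, 2)    # joint p((x1, x2), y)

print(mutual_information(p_x1y))   # 0.0 bits: X1 alone says nothing about Y
print(mutual_information(p_x2y))   # 0.0 bits: likewise for X2
print(mutual_information(p_x12y))  # 1.0 bit: jointly, the sources determine Y
```

In PID terms, the 1 bit of joint information here is purely synergistic: no unique or redundant atoms, which is exactly the kind of structure a nonnegative decomposition makes explicit.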

MAXENT3D_PID: An Estimator for the Maximum-entropy Trivariate Partial Information Decomposition

2 code implementations · 10 Jan 2019 · Abdullah Makkeh, Daniel Chicharro, Dirk Oliver Theis, Raul Vicente

Chicharro (2017) introduced a procedure to determine multivariate partial information measures within the maximum entropy framework, separating unique, redundant, and synergistic components of information.

Computation Optimization and Control

The identity of information: how deterministic dependencies constrain information synergy and redundancy

no code implementations · 13 Nov 2017 · Daniel Chicharro, Giuseppe Pica, Stefano Panzeri

Harder et al. (2013) proposed an identity axiom stating that there cannot be redundancy between two independent sources about a copy of themselves.

Invariant components of synergy, redundancy, and unique information among three variables

no code implementations · 27 Jun 2017 · Giuseppe Pica, Eugenio Piasini, Daniel Chicharro, Stefano Panzeri

In a system of three stochastic variables, the Partial Information Decomposition (PID) of Williams and Beer dissects the information that two variables (sources) carry about a third variable (target) into nonnegative information atoms describing redundant, unique, and synergistic modes of dependency among the variables.

Conditionally-additive-noise Models for Structure Learning

no code implementations · 20 May 2019 · Daniel Chicharro, Stefano Panzeri, Ilya Shpitser

Methods based on additive-noise (AN) models have been proposed to further discriminate between causal structures that are equivalent in terms of conditional independencies.
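The asymmetry that additive-noise methods exploit can be sketched in a few lines. This is a hypothetical illustration of the general AN idea, not the paper's method: if Y = f(X) + N with N independent of X, regressing Y on X leaves residuals independent of X, while regressing X on Y in the anticausal direction generally does not. A crude proxy for dependence (correlation between squared residuals and the absolute regressor) stands in for a proper independence test such as HSIC.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=5000)
y = x ** 3 + rng.normal(scale=0.5, size=x.size)  # true causal direction X -> Y

def dependence_score(a, b, deg=5):
    """Fit b ~ poly(a) and score how strongly the residuals depend on a."""
    resid = b - np.polyval(np.polyfit(a, b, deg), a)
    return abs(np.corrcoef(np.abs(a), resid ** 2)[0, 1])

forward = dependence_score(x, y)   # residuals resemble the true noise: low score
backward = dependence_score(y, x)  # residual spread varies with y: higher score
print(forward < backward)          # the asymmetry points to X -> Y
```

In the anticausal fit, the residual variance is largest near y = 0, where the inverse map is steepest, so the residuals are visibly dependent on the regressor; that asymmetry is what lets AN models orient edges that conditional independencies alone leave undirected.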

regression

Causal learning with sufficient statistics: an information bottleneck approach

no code implementations · 12 Oct 2020 · Daniel Chicharro, Michel Besserve, Stefano Panzeri

Using these statistics, we formulate additional rules of causal orientation that provide causal information not obtainable from standard structure-learning algorithms, which exploit only conditional independencies between observable variables.

Dimensionality Reduction
