69 papers with code • 0 benchmarks • 1 dataset
Estimation of causal effects involves crucial assumptions about the data-generating process, such as directionality of effect, presence of instrumental variables or mediators, and whether all relevant confounders are observed.
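The no-unobserved-confounders assumption can be illustrated with a small simulation (a hypothetical sketch in plain NumPy, not any particular framework's API): when a common cause Z of treatment T and outcome Y is observed, adjusting for it recovers the true effect, while ignoring it biases the estimate through the backdoor path T ← Z → Y.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Data-generating process with an observed confounder Z:
# Z -> T, Z -> Y, and a true causal effect T -> Y of 2.0.
z = rng.normal(size=n)
t = 1.5 * z + rng.normal(size=n)
y = 2.0 * t + 3.0 * z + rng.normal(size=n)

# Naive estimate: slope of Y on T alone, biased by the path T <- Z -> Y.
naive = np.polyfit(t, y, 1)[0]

# Adjusted estimate: regress Y on both T and Z (backdoor adjustment).
X = np.column_stack([t, z, np.ones(n)])
adjusted = np.linalg.lstsq(X, y, rcond=None)[0][0]

print(naive, adjusted)  # naive is inflated; adjusted is close to 2.0
```

The same logic is why assumptions such as directionality matter: the regression itself cannot tell which adjustment set is valid, only the assumed graph can.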
This paper presents a new open source Python framework for causal discovery from observational data and domain background knowledge, aimed at causal graph and causal mechanism modeling.
We show that existing causal discovery methods such as FCI and variants suffer from low recall in the autocorrelated time series case and identify low effect size of conditional independence tests as the main reason.
Discovering contemporaneous and lagged causal relations in autocorrelated nonlinear time series datasets
We consider causal discovery from time series using conditional independence (CI) based network learning algorithms such as the PC algorithm.
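The skeleton phase of PC-style, CI-based learning can be sketched roughly as follows (a minimal illustration using a Fisher-z partial-correlation test, not any particular package's implementation): start from a fully connected graph and delete an edge whenever some conditioning set renders its endpoints independent.

```python
import numpy as np
from itertools import combinations
from scipy import stats

def fisher_z_pval(data, i, j, cond):
    """Partial-correlation CI test via Fisher's z transform."""
    idx = [i, j] + list(cond)
    corr = np.corrcoef(data[:, idx], rowvar=False)
    prec = np.linalg.inv(corr)
    r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])
    n = data.shape[0]
    z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - len(cond) - 3)
    return 2 * (1 - stats.norm.cdf(abs(z)))

def pc_skeleton(data, alpha=0.01):
    """Skeleton phase of the PC algorithm: begin fully connected,
    remove edges whose endpoints test (conditionally) independent."""
    d = data.shape[1]
    adj = {frozenset(e) for e in combinations(range(d), 2)}
    for size in range(d - 1):          # grow the conditioning-set size
        for i, j in list(combinations(range(d), 2)):
            if frozenset((i, j)) not in adj:
                continue
            neighbors = [k for k in range(d)
                         if k not in (i, j) and frozenset((i, k)) in adj]
            for cond in combinations(neighbors, size):
                if fisher_z_pval(data, i, j, cond) > alpha:
                    adj.discard(frozenset((i, j)))
                    break
    return adj

# Chain X -> Y -> Z: X and Z are dependent but independent given Y.
rng = np.random.default_rng(1)
x = rng.normal(size=20_000)
y = x + 0.5 * rng.normal(size=20_000)
z = y + 0.5 * rng.normal(size=20_000)
skel = pc_skeleton(np.column_stack([x, y, z]))
print(skel)  # edges X-Y and Y-Z remain; X-Z should be removed given Y
```

Time-series variants replace the CI test and restrict conditioning sets to temporal predecessors, but the remove-edges-by-independence loop is the same.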
Conditional independence testing based on a nearest-neighbor estimator of conditional mutual information
Combining the local permutation scheme with kernel-based tests leads to better calibration but reduced power.
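A nearest-neighbor conditional mutual information estimator of the kind referenced above can be sketched as follows, assuming a Frenzel–Pompe/KSG-style construction with max-norm distances (the function name and counting convention are illustrative, not the paper's exact estimator):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def cmi_knn(x, y, z, k=5):
    """kNN estimate of I(X;Y|Z) in nats: psi(k) plus the average of
    psi(n_z+1) - psi(n_xz+1) - psi(n_yz+1), with max-norm neighborhoods."""
    xyz = np.column_stack([x, y, z])
    # Distance to the k-th neighbor in the joint space (excluding the point).
    d_k = cKDTree(xyz).query(xyz, k=k + 1, p=np.inf)[0][:, -1]

    def count(pts):
        tree = cKDTree(pts)
        # Count strictly within the joint-space radius; minus 1 for the point.
        return np.array([len(tree.query_ball_point(p, r - 1e-12, p=np.inf)) - 1
                         for p, r in zip(pts, d_k)])

    n_xz = count(np.column_stack([x, z]))
    n_yz = count(np.column_stack([y, z]))
    n_z = count(z.reshape(-1, 1))
    return digamma(k) + np.mean(digamma(n_z + 1)
                                - digamma(n_xz + 1) - digamma(n_yz + 1))

rng = np.random.default_rng(2)
z = rng.normal(size=2000)
x = z + 0.3 * rng.normal(size=2000)
y = z + 0.3 * rng.normal(size=2000)
x2 = x + 0.3 * rng.normal(size=2000)
print(cmi_knn(x, y, z))   # close to 0: X and Y are independent given Z
print(cmi_knn(x, x2, z))  # clearly positive: X2 depends on X beyond Z
```

A local permutation test would then shuffle x among samples with similar z values and compare the resulting null distribution of this statistic against the observed value.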
Discovering causal relations among a set of variables is a long-standing question in many empirical sciences.
We next utilize the augmented form to develop a masked structure learning method that can be trained efficiently with gradient-based optimization, by leveraging a smooth characterization of acyclicity and the Gumbel-Softmax approach to approximate the binary adjacency matrix.
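The "smooth characterization of acyclicity" plausibly resembles the NOTEARS-style matrix-exponential constraint h(A) = tr(exp(A ∘ A)) − d, which is zero exactly when the weighted adjacency matrix A encodes no directed cycle; a minimal sketch under that assumption:

```python
import numpy as np
from scipy.linalg import expm

def acyclicity(A):
    """Smooth acyclicity measure h(A) = tr(exp(A * A)) - d.
    Each term tr((A*A)^k)/k! sums weighted cycles of length k,
    so h(A) = 0 iff the graph encoded by A is acyclic."""
    d = A.shape[0]
    return np.trace(expm(A * A)) - d

dag = np.array([[0., 1., 1.],
                [0., 0., 1.],
                [0., 0., 0.]])           # upper-triangular: acyclic
cyc = dag.copy()
cyc[2, 0] = 1.                           # adds the cycle 0 -> 2 -> 0

print(acyclicity(dag), acyclicity(cyc))  # ~0 for the DAG, > 0 with the cycle
```

Because h is differentiable in A, it can serve as a penalty or constraint in gradient-based training, while a Gumbel-Softmax relaxation keeps sampling of the (approximately) binary adjacency entries differentiable as well.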