no code implementations • 22 Jan 2023 • Murat Kocaoglu
To address this, we propose using conditional independence tests where the size of the conditioning set is upper bounded by some integer $k$ for robust causal discovery.
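The idea of bounding the conditioning-set size can be illustrated with a PC-style skeleton search that never conditions on more than $k$ variables. This is a minimal sketch, not the paper's algorithm: the `ci_test` callable is a hypothetical oracle interface, and the real method involves further orientation rules and robustness analysis.

```python
from itertools import combinations

def k_bounded_skeleton(variables, ci_test, k):
    """PC-style skeleton search with conditioning sets of size at most k.

    ci_test(x, y, S) -> True if x is independent of y given the set S.
    This is a hypothetical interface used only for illustration.
    """
    # Start from the complete undirected graph over the variables.
    edges = {frozenset(p) for p in combinations(variables, 2)}
    for size in range(k + 1):                    # conditioning sets of size 0..k
        for edge in list(edges):
            x, y = tuple(edge)
            others = [v for v in variables if v not in edge]
            for S in combinations(others, size):
                if ci_test(x, y, set(S)):        # independence found: drop edge
                    edges.discard(edge)
                    break
    return edges
```

For the chain $X \to Y \to Z$, an oracle declaring $X \perp Z \mid Y$ lets the search with $k = 1$ recover exactly the edges $X\!-\!Y$ and $Y\!-\!Z$.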
no code implementations • NeurIPS 2020 • Spencer Compton, Murat Kocaoglu, Kristjan Greenewald, Dmitriy Katz
This unobserved randomness is measured by the entropy of the exogenous variable in the underlying structural causal model, which governs the causal relation between the observed variables.
no code implementations • NeurIPS 2020 • Amin Jaber, Murat Kocaoglu, Karthikeyan Shanmugam, Elias Bareinboim
One fundamental problem in the empirical sciences is that of reconstructing the causal structure that underlies a phenomenon of interest through observation and experimentation.
1 code implementation • NeurIPS 2020 • Chandler Squires, Sara Magliacane, Kristjan Greenewald, Dmitriy Katz, Murat Kocaoglu, Karthikeyan Shanmugam
Most existing works focus on worst-case or average-case lower bounds for the number of interventions required to orient a DAG.
3 code implementations • 1 Nov 2020 • Chandler Squires, Sara Magliacane, Kristjan Greenewald, Dmitriy Katz, Murat Kocaoglu, Karthikeyan Shanmugam
Most existing works focus on worst-case or average-case lower bounds for the number of interventions required to orient a DAG.
no code implementations • NeurIPS 2019 • Murat Kocaoglu, Amin Jaber, Karthikeyan Shanmugam, Elias Bareinboim
We introduce a novel notion of interventional equivalence class of causal graphs with latent variables based on these invariances, which associates each graphical structure with a set of interventional distributions that respect the do-calculus rules.
no code implementations • NeurIPS 2019 • Kristjan Greenewald, Dmitriy Katz, Karthikeyan Shanmugam, Sara Magliacane, Murat Kocaoglu, Enric Boix Adsera, Guy Bresler
We consider the problem of experimental design for learning causal graphs that have a tree structure.
no code implementations • NeurIPS 2018 • Erik M. Lindgren, Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath
We consider the minimum cost intervention design problem: Given the essential graph of a causal graph and a cost to intervene on a variable, identify the set of interventions with minimum total cost that can learn any causal graph with the given essential graph.
no code implementations • NeurIPS 2020 • Murat Kocaoglu, Sanjay Shakkottai, Alexandros G. Dimakis, Constantine Caramanis, Sriram Vishwanath
We study the problem of discovering the simplest latent variable that can make two observed discrete variables conditionally independent.
no code implementations • NeurIPS 2017 • Murat Kocaoglu, Karthikeyan Shanmugam, Elias Bareinboim
Next, we propose an algorithm that can learn the latents between both non-adjacent and adjacent variables using only $O(d^2 \log n)$ interventions.
2 code implementations • ICLR 2018 • Murat Kocaoglu, Christopher Snyder, Alexandros G. Dimakis, Sriram Vishwanath
We show that adversarial training can be used to learn a generative model with true observational and interventional distributions if the generator architecture is consistent with the given causal graph.
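The structural constraint on the generator can be sketched as follows: each variable's generator consumes only its parents in the causal graph plus independent noise. This is a toy sampler illustrating the architectural idea only; the adversarial training and neural mechanisms of the paper are omitted, and the function names here are illustrative assumptions.

```python
import random

def sample_causal_generator(graph, mechanisms):
    """Sample from a generator whose wiring follows a causal graph.

    graph: dict mapping each variable to its list of parents, given in
    topological order (a DAG). mechanisms: dict mapping each variable to
    a function of (parent-value dict, exogenous noise). Both names are
    illustrative, not the paper's API.
    """
    values = {}
    for var, parents in graph.items():           # relies on topological order
        noise = random.random()                  # independent exogenous input
        values[var] = mechanisms[var]({p: values[p] for p in parents}, noise)
    return values
```

Because each mechanism sees only its parents, interventions can be simulated by overriding one mechanism with a constant, leaving the rest of the wiring untouched.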
no code implementations • 8 Mar 2017 • Karthikeyan Shanmugam, Murat Kocaoglu, Alexandros G. Dimakis, Sujay Sanghavi
We consider support recovery in the quadratic logistic regression setting, where the target depends on both $p$ linear terms $x_i$ and up to $p^2$ quadratic terms $x_i x_j$.
no code implementations • ICML 2017 • Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath
We consider the problem of learning a causal graph over a set of variables with interventions.
no code implementations • 28 Jan 2017 • Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath, Babak Hassibi
This framework requires the solution of a minimum entropy coupling problem: given the marginal distributions of $m$ discrete random variables, each on $n$ states, find the minimum-entropy joint distribution that respects the given marginals.
1 code implementation • 12 Nov 2016 • Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath, Babak Hassibi
We show that the problem of finding the exogenous variable with minimum entropy is equivalent to the problem of finding minimum joint entropy given $n$ marginal distributions, also known as minimum entropy coupling problem.
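The flavor of the coupling problem can be conveyed with a greedy heuristic: repeatedly match the largest remaining mass in each marginal, assign their minimum to one joint outcome, and subtract. This is a sketch of the greedy idea only, not the paper's exact algorithm or its approximation guarantee.

```python
def greedy_min_entropy_coupling(marginals, tol=1e-12):
    """Greedy heuristic for a low-entropy coupling of given marginals.

    marginals: list of m probability vectors (lists of floats). Returns
    a dict mapping an index tuple (one index per marginal) to its joint
    mass. Consistent with the marginals by construction.
    """
    residual = [list(p) for p in marginals]
    joint = {}
    while True:
        # Largest remaining mass in each marginal.
        idx = [max(range(len(p)), key=p.__getitem__) for p in residual]
        mass = min(residual[j][i] for j, i in enumerate(idx))
        if mass <= tol:                          # all mass has been assigned
            break
        key = tuple(idx)
        joint[key] = joint.get(key, 0.0) + mass  # place joint mass
        for j, i in enumerate(idx):              # subtract it from each marginal
            residual[j][i] -= mass
    return joint
```

For two uniform binary marginals the greedy step pairs the two heavy cells, yielding the one-bit coupling $\{(0,0): 0.5,\ (1,1): 0.5\}$ rather than the two-bit independent coupling.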
no code implementations • 1 Jun 2016 • Rajat Sen, Karthikeyan Shanmugam, Murat Kocaoglu, Alexandros G. Dimakis, Sanjay Shakkottai
Our algorithm achieves a regret of $\mathcal{O}\left(L\mathrm{poly}(m, \log K) \log T \right)$ at time $T$, as compared to $\mathcal{O}(LK\log T)$ for conventional contextual bandits, assuming a constant gap between the best arm and the rest for each context.
2 code implementations • NeurIPS 2015 • Karthikeyan Shanmugam, Murat Kocaoglu, Alexandros G. Dimakis, Sriram Vishwanath
We prove that any deterministic adaptive algorithm needs to be a separating system in order to learn complete graphs in the worst case.
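A separating system over $n$ vertices can be built by labeling vertices with binary codes: intervention $i$ contains every vertex whose $i$-th bit is 1, so $\lceil \log_2 n \rceil$ interventions split every pair. This classic construction is shown only to illustrate the separating-system requirement; the paper's results also constrain intervention sizes, which this sketch ignores.

```python
from math import ceil, log2

def separating_interventions(n):
    """Separating system on vertices 0..n-1 via binary labels.

    Intervention i is the set of vertices whose i-th bit is 1; any two
    distinct vertices differ in some bit, so some intervention contains
    exactly one of them.
    """
    k = max(1, ceil(log2(n)))
    return [{v for v in range(n) if (v >> i) & 1} for i in range(k)]

def is_separating(sets, n):
    """Check that every pair of vertices is split by some set."""
    return all(
        any((u in s) != (v in s) for s in sets)
        for u in range(n) for v in range(u + 1, n)
    )
```

For $n = 8$ this gives exactly three interventions, matching the $\lceil \log_2 n \rceil$ count.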
no code implementations • NeurIPS 2014 • Murat Kocaoglu, Karthikeyan Shanmugam, Alexandros G. Dimakis, Adam Klivans
We give an algorithm for exactly reconstructing $f$ given random examples from the uniform distribution on $\{-1, 1\}^n$ that runs in time polynomial in $n$ and $2^s$, and succeeds if the function satisfies the unique sign property: there is one output value which corresponds to a unique set of values of the participating parities.