
Applications of Common Entropy for Causal Inference
Murat Kocaoglu · Sanjay Shakkottai · Alexandros Dimakis · Constantine Caramanis · Sriram Vishwanath

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #882

We study the problem of discovering the simplest latent variable that can make two observed discrete variables conditionally independent. The minimum entropy required for such a latent is known as common entropy in information theory. We extend this notion to Rényi common entropy by minimizing the Rényi entropy of the latent variable. To efficiently compute common entropy, we propose an iterative algorithm that can be used to discover the trade-off between the entropy of the latent variable and the conditional mutual information of the observed variables. We show two applications of common entropy in causal inference: First, under the assumption that there are no low-entropy mediators, it can be used to distinguish direct causation from spurious correlation among almost all joint distributions on simple causal graphs with two observed variables. Second, common entropy can be used to improve constraint-based methods such as the PC or FCI algorithms in the small-sample regime, where these methods are known to struggle. We propose a modification to these constraint-based methods that uses common entropy to assess whether a separating set found by these algorithms is valid. Finally, we evaluate our algorithms on synthetic and real data to establish their performance.
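As a small illustration of the two quantities being traded off, the following sketch computes the entropy H(Z) of a candidate latent and the residual conditional mutual information I(X;Y|Z) from a joint distribution p(x, y, z). This is not the authors' iterative algorithm, only a check of whether a given latent renders the observed pair conditionally independent; the function names and the toy distribution are our own.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def conditional_mutual_information(pxyz):
    """I(X;Y|Z) in bits for a joint distribution given as a 3-D array p[x, y, z]."""
    pz = pxyz.sum(axis=(0, 1))   # p(z)
    pxz = pxyz.sum(axis=1)       # p(x, z)
    pyz = pxyz.sum(axis=0)       # p(y, z)
    cmi = 0.0
    for x in range(pxyz.shape[0]):
        for y in range(pxyz.shape[1]):
            for z in range(pxyz.shape[2]):
                p = pxyz[x, y, z]
                if p > 0:
                    cmi += p * np.log2(p * pz[z] / (pxz[x, z] * pyz[y, z]))
    return cmi

# Toy example: Z is a fair coin and X = Y = Z, so the latent Z fully
# explains the dependence between X and Y: I(X;Y|Z) = 0 with H(Z) = 1 bit.
pxyz = np.zeros((2, 2, 2))
pxyz[0, 0, 0] = 0.5
pxyz[1, 1, 1] = 0.5

h_z = entropy(pxyz.sum(axis=(0, 1)))
residual_cmi = conditional_mutual_information(pxyz)
```

A latent with I(X;Y|Z) = 0 and small H(Z) is evidence against a direct causal edge; the paper's algorithm searches for such a latent by sweeping this trade-off.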

Author Information

Murat Kocaoglu (Purdue University)
Sanjay Shakkottai (University of Texas at Austin)
Alex Dimakis (University of Texas at Austin)
Constantine Caramanis (University of Texas at Austin)
Sriram Vishwanath (University of Texas at Austin)