Bayesian Sensitivity of Causal Inference Estimators under Evidence-Based Priors
Nikita Dhawan · Daniel Shen · Leonardo Cotta · Chris Maddison
Abstract
Causal inference, especially in observational studies, relies on untestable assumptions about the true data-generating process. Sensitivity analysis helps us assess how robust our conclusions are to changes in these assumptions. Existing frameworks have focused on worst-case changes. In this work, we argue that such pessimistic criteria can often be uninformative or lead to conclusions that contradict our prior knowledge about the world. We generalize the recent $s$-value framework to estimate the sensitivity of three common assumptions in causal inference, and our experiments confirm that worst-case conclusions about sensitivity can indeed hinge on unrealistic changes to the data-generating process. To overcome this limitation, we extend the $s$-value framework with a new criterion, the Bayesian Sensitivity Value (BSV), which computes the expected sensitivity of an estimate to assumption violations under priors constructed from real-world evidence.
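The contrast between the two criteria can be illustrated with a minimal numerical sketch. Everything below is hypothetical: the one-dimensional confounding parameter `gamma`, the toy `sensitivity` curve, and the Gamma-distributed prior are illustrative stand-ins, not the paper's actual $s$-value construction or evidence-based priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensitivity curve: how much the effect estimate shifts
# when an unobserved confounder of strength `gamma` is introduced.
# (Illustrative stand-in for the paper's s-value machinery.)
def sensitivity(gamma):
    return np.abs(0.5 * gamma)  # shift grows with confounding strength

# Worst-case criterion: maximize the shift over the allowed range of
# violations, so the answer is driven by the most extreme gamma considered.
gammas = np.linspace(0.0, 4.0, 401)
worst_case = sensitivity(gammas).max()  # = 2.0 here

# Bayesian Sensitivity Value (sketch): average the same curve under a
# prior that, like real-world evidence might, concentrates on mild
# confounding (Gamma prior with mean 1.0).
prior_draws = rng.gamma(shape=2.0, scale=0.5, size=100_000)
bsv = sensitivity(prior_draws).mean()  # close to 0.5 here

print(worst_case)
print(bsv)
```

Because the worst-case value is set entirely by the largest violation in the range while the BSV weights violations by their prior plausibility, the BSV is far smaller whenever the prior puts little mass on extreme confounding, which is the abstract's point about pessimistic criteria.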