We argue that regularizing terms in standard regression methods not only help against overfitting finite data, but sometimes also yield better causal models. We first consider a multi-dimensional variable linearly influencing a target variable in the presence of a multi-dimensional unobserved common cause, where the confounding effect can be decreased by keeping the penalizing term in Ridge and Lasso regression even in the population limit. The reason is a close analogy between overfitting and confounding that we observe in our toy model. Whereas regularization constants are usually chosen via cross-validation to counter overfitting, here we choose the regularization constant by first estimating the strength of confounding, which yields reasonable results for simulated and real data. Further, we prove a ‘causal generalization bound’ which states (subject to our particular model of confounding) that the error made by interpreting any non-linear regression as a causal model can be bounded from above whenever the functions are taken from a not too rich class.
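To make the linear setting above concrete, the following sketch (our illustration, not code from the paper) compares OLS and Ridge regression in the population limit of a confounded model: a hidden confounder Z influences both the covariates X (through a mixing matrix M) and the target Y (through weights c), while a is the true causal coefficient vector. The dimensions, confounding strength, and regularization constant `lam` are arbitrary illustrative choices; with strong confounding, the Ridge solution is typically closer to the causal coefficients than ordinary least squares, even though both are computed from exact population covariances.

```python
# Illustrative sketch (not the paper's code): population-limit comparison of OLS
# and Ridge regression under hidden confounding. All parameter values are
# arbitrary choices for illustration.
import numpy as np

rng = np.random.default_rng(0)
d, k = 10, 20                       # dimensions of observed X and hidden confounder Z

a = rng.normal(size=d)              # true causal coefficients X -> Y
M = rng.normal(size=(k, d))         # mixing matrix: Z -> X
c = 5.0 * rng.normal(size=k)        # weights Z -> Y (strong confounding)

# Population covariances for Z ~ N(0, I), X = M^T Z + E with E ~ N(0, I),
# and Y = a^T X + c^T Z + independent noise:
Sigma_X = M.T @ M + np.eye(d)       # Cov(X)
cov_XY = Sigma_X @ a + M.T @ c      # Cov(X, Y); the second term is the confounding part

# Population-limit estimators: OLS vs. Ridge with a penalty that is kept
# even in the infinite-sample limit.
beta_ols = np.linalg.solve(Sigma_X, cov_XY)
lam = 5.0                           # illustrative regularization constant
beta_ridge = np.linalg.solve(Sigma_X + lam * np.eye(d), cov_XY)

print("OLS   distance to causal coefficients:", np.linalg.norm(beta_ols - a))
print("Ridge distance to causal coefficients:", np.linalg.norm(beta_ridge - a))
```

In the paper, the regularization constant is not fixed by hand as above but chosen from an estimate of the confounding strength, since cross-validation cannot detect confounding.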
Author Information
Dominik Janzing (Amazon)
More from the Same Authors
- 2022: Quantifying Causal Contribution in Rare Event Data
  Caner Turkmen · Dominik Janzing · Oleksandr Shchur · Lenon Minorics · Laurent Callot
- 2020: Keynotes: Dominik Janzing
  Dominik Janzing
- 2019 Poster: Perceiving the arrow of time in autoregressive motion
  Kristof Meding · Dominik Janzing · Bernhard Schölkopf · Felix A. Wichmann
- 2019 Poster: Selecting causal brain features with a single conditional independence test per feature
  Atalanti Mastakouri · Bernhard Schölkopf · Dominik Janzing
- 2019 Spotlight: Perceiving the arrow of time in autoregressive motion
  Kristof Meding · Dominik Janzing · Bernhard Schölkopf · Felix A. Wichmann