

Poster

Assessing Disparate Impact of Personalized Interventions: Identifiability and Bounds

Nathan Kallus · Angela Zhou

East Exhibition Hall B + C #72

Keywords: [ Causal Inference ] [ Probabilistic Methods ] [ Applications ] [ Fairness, Accountability, and Transparency ]


Abstract:

Personalized interventions in social services, education, and healthcare leverage individual-level causal effect predictions in order to give the best treatment to each individual or to prioritize program interventions for the individuals most likely to benefit. While the sensitivity of these domains compels us to evaluate the fairness of such policies, we show that actually auditing their disparate impacts per standard observational metrics, such as true positive rates, is impossible since ground truths are unknown. Whether our data is experimental or observational, an individual's actual outcome under an intervention different from the one received can never be known, only predicted based on features. We prove how we can nonetheless point-identify these quantities under the additional assumption of monotone treatment response, which may be reasonable in many applications. We further provide a sensitivity analysis for this assumption via sharp partial-identification bounds under violations of monotonicity of varying strengths. We show how to use our results to audit personalized interventions using partially-identified ROC and xROC curves and demonstrate this in a case study of a French job training dataset.
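To make the auditing problem concrete, the following is a minimal, hypothetical sketch (not the paper's estimator): given estimated outcome probabilities mu1(x) = E[Y(1)|X=x] and mu0(x) = E[Y(0)|X=x] fitted from experimental or observational data, it bounds a group's true positive rate when the "positive" label (benefiting from treatment, Y(1)=1 and Y(0)=0) is never observed. The function name group_tpr_bounds, the delta parameter capping the probability of being harmed by treatment, and the simple per-unit interval relaxation are all assumptions introduced here for illustration; setting delta=0 corresponds to point identification under monotone treatment response.

```python
import numpy as np

def group_tpr_bounds(mu1, mu0, treat, group, delta=0.0):
    """Illustrative sketch (hypothetical, not the paper's estimator).

    Bounds the per-group true positive rate of a treatment rule when the
    "positive" label -- benefiting from treatment, i.e. Y(1)=1, Y(0)=0 --
    is never observed for any individual.

    mu1, mu0 : estimated outcome probabilities E[Y(1)|X], E[Y(0)|X]
    treat    : 0/1 decisions of the personalized rule being audited
    group    : group labels (e.g., a protected attribute)
    delta    : assumed cap on P(Y(1)=0, Y(0)=1 | X), the allowed violation
               of monotone treatment response; delta=0 yields point
               identification under monotonicity.
    """
    mu1, mu0 = np.asarray(mu1, float), np.asarray(mu0, float)
    treat, group = np.asarray(treat), np.asarray(group)
    out = {}
    for a in np.unique(group):
        m = group == a
        # Bounds on P(benefit | X): Frechet-style interval, tightened by delta,
        # using P(benefit) = (mu1 - mu0) + P(harm) and P(harm) <= delta.
        lo = np.clip(mu1[m] - mu0[m], 0.0, 1.0)
        hi = np.minimum(np.minimum(mu1[m], 1.0 - mu0[m]),
                        mu1[m] - mu0[m] + delta)
        hi = np.maximum(hi, lo)  # keep the interval non-empty
        z = treat[m].astype(float)
        # TPR = E[treat * P(benefit|X)] / E[P(benefit|X)].  It increases in
        # P(benefit|X) for treated units and decreases for untreated ones,
        # so the extremes are attained at the interval endpoints.
        p_up = np.where(z == 1, hi, lo)    # maximizes TPR
        p_dn = np.where(z == 1, lo, hi)    # minimizes TPR
        tpr_hi = float(p_up @ z / max(p_up.sum(), 1e-12))
        tpr_lo = float(p_dn @ z / max(p_dn.sum(), 1e-12))
        out[a] = (tpr_lo, tpr_hi)
    return out


# Toy usage with synthetic data (illustration only).
rng = np.random.default_rng(0)
n = 1000
mu0 = rng.uniform(0.1, 0.5, n)
mu1 = np.clip(mu0 + rng.uniform(0.0, 0.4, n), 0.0, 1.0)
group = rng.integers(0, 2, n)
treat = (mu1 - mu0 > 0.2).astype(int)          # a simple threshold rule
print(group_tpr_bounds(mu1, mu0, treat, group, delta=0.05))
```

Computing such bounds per group and sweeping the rule's threshold would trace out the partially-identified ROC and xROC curves referred to in the abstract.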
