Fair Performance Metric Elicitation
Gaurush Hiranandani · Harikrishna Narasimhan · Sanmi Koyejo

Wed Dec 09 09:00 AM -- 11:00 AM (PST) @ Poster Session 3 #866

What is a fair performance metric? We consider the choice of fairness metrics through the lens of metric elicitation -- a principled framework for selecting performance metrics that best reflect implicit preferences. Metric elicitation enables a practitioner to tune the performance and fairness metrics to the task, context, and population at hand. Specifically, we propose a novel strategy to elicit group-fair performance metrics for multiclass classification problems with multiple sensitive groups; the strategy also selects the trade-off between predictive performance and fairness violation. The proposed elicitation strategy requires only relative preference feedback and is robust to both finite sample and feedback noise.
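To illustrate the flavor of elicitation from relative preference feedback (this is not the paper's algorithm, only a hypothetical sketch): suppose candidate classifiers lie on a convex performance-fairness frontier, so an oracle's hidden weighted cost is unimodal along the frontier. Then the oracle's preferred trade-off point can be found with a small number of pairwise comparisons via binary search. The candidate frontier, hidden trade-off weight, and oracle below are all simulated assumptions.

```python
def elicit_best(oracle_prefers, candidates):
    """Find the oracle-optimal candidate on a frontier where the hidden
    cost is unimodal in the frontier index, using only pairwise
    preference queries (oracle_prefers(a, b) is True iff a is preferred)."""
    lo, hi = 0, len(candidates) - 1
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if oracle_prefers(candidates[mid], candidates[mid + 1]):
            hi = mid          # minimum lies at or below mid
        else:
            lo = mid + 1      # minimum lies strictly above mid
    return candidates[lo] if oracle_prefers(candidates[lo], candidates[hi]) \
        else candidates[hi]

# Simulated oracle with a hidden performance/fairness trade-off weight.
HIDDEN = 0.6  # assumption: oracle's implicit weight on fairness violation
def oracle(a, b):
    cost = lambda c: (1 - HIDDEN) * c[0] + HIDDEN * c[1]
    return cost(a) < cost(b)

# Candidates (perf_loss, fairness_violation) on a convex frontier.
cands = [(i / 200, (1 - i / 200) ** 2) for i in range(201)]
best = elicit_best(oracle, cands)
```

Each query reveals only a relative preference, never the hidden weight itself, yet roughly log2(len(candidates)) comparisons suffice; the paper's setting generalizes this idea to multiclass confusion matrices with multiple sensitive groups and noisy feedback.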

Author Information

Gaurush Hiranandani (UIUC)
Harikrishna Narasimhan (Google Research)
Sanmi Koyejo (Illinois / Google)

Sanmi Koyejo is an Assistant Professor in the Department of Computer Science at Stanford University. Koyejo also spends time at Google as part of the Brain team. Koyejo's research interests are in developing the principles and practice of trustworthy machine learning. Additionally, Koyejo focuses on applications to neuroscience and healthcare. Koyejo has been the recipient of several awards, including a best paper award from the Conference on Uncertainty in Artificial Intelligence (UAI), a Skip Ellis Early Career Award, and a Sloan Fellowship. Koyejo serves as the president of the Black in AI organization.
