Poster

Characterizing Out-of-Distribution Error via Optimal Transport

Yuzhe Lu · Yilong Qin · Runtian Zhai · Andrew Shen · Ketong Chen · Zhenlin Wang · Soheil Kolouri · Simon Stepputtis · Joseph Campbell · Katia Sycara

Great Hall & Hall B1+B2 (level 1) #725
Wed 13 Dec 3 p.m. PST — 5 p.m. PST

Abstract:

Out-of-distribution (OOD) data poses serious challenges to deployed machine learning models, so methods that predict a model's performance on OOD data without labels are important for machine learning safety. While a number of methods have been proposed in prior work, they often underestimate the actual error, sometimes by a large margin, which greatly limits their applicability to real tasks. In this work, we identify pseudo-label shift, or the difference between the predicted and true OOD label distributions, as a key indicator of this underestimation. Based on this observation, we introduce a novel method for estimating model performance by leveraging optimal transport theory, Confidence Optimal Transport (COT), and show that it provably provides more robust error estimates in the presence of pseudo-label shift. Additionally, we introduce an empirically-motivated variant of COT, Confidence Optimal Transport with Thresholding (COTT), which applies thresholding to the individual transport costs and further improves the accuracy of COT's error estimates. We evaluate COT and COTT on a variety of standard benchmarks that induce various types of distribution shift -- synthetic, novel subpopulation, and natural -- and show that our approaches significantly outperform existing state-of-the-art methods with up to 3x lower prediction errors.
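The idea in the abstract can be sketched concretely: match each unlabeled softmax prediction to a one-hot label drawn from an assumed source label marginal via optimal transport, and read the matched transport cost as a proxy for error. The sketch below is illustrative only, not the authors' released implementation: the total-variation cost, the way the label marginal is converted to one-hot targets, and the threshold value are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def cot_estimate(softmax_preds, label_marginal, threshold=None):
    """Illustrative COT-style error estimate (hypothetical helper).

    Matches each prediction to a one-hot target sampled in proportion to an
    assumed source label marginal, using total-variation distance as the
    transport cost. Returns the mean matched cost (COT-style); if `threshold`
    is given, returns the fraction of matched costs above it (COTT-style).
    """
    n, k = softmax_preds.shape
    # Build n one-hot target vectors according to the label marginal.
    counts = np.round(label_marginal * n).astype(int)
    counts[-1] = n - counts[:-1].sum()  # fix rounding so counts sum to n
    targets = np.repeat(np.eye(k), counts, axis=0)
    # Pairwise total-variation cost between predictions and one-hot targets.
    cost = 0.5 * np.abs(softmax_preds[:, None, :] - targets[None, :, :]).sum(-1)
    rows, cols = linear_sum_assignment(cost)  # solve the matching (OT with equal masses)
    matched = cost[rows, cols]
    if threshold is None:
        return matched.mean()                 # COT-style estimate
    return (matched > threshold).mean()       # COTT-style thresholded estimate

# Confident, marginal-consistent predictions yield a near-zero estimate;
# maximally uncertain predictions yield a high one.
confident = np.eye(2)[[0, 0, 1, 1]].astype(float)
uncertain = np.full((4, 2), 0.5)
marginal = np.array([0.5, 0.5])
print(cot_estimate(confident, marginal))   # low
print(cot_estimate(uncertain, marginal))   # high
```

Solving the matching with `linear_sum_assignment` is cubic in the number of samples, so a practical implementation would use an approximate OT solver for large evaluation sets.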
