
Poster in Workshop on Distribution Shifts: Connecting Methods and Applications

Is Unsupervised Performance Estimation Impossible When Both Covariates and Labels Shift?

Lingjiao Chen · Matei Zaharia · James Zou


Abstract:

Accurately estimating and explaining an ML model’s performance on new datasets is increasingly critical for reliable ML model deployment. When no labels are available on the new datasets, existing performance estimation paradigms typically assume either covariate shift or label shift, and thus suffer poor estimation accuracy when those assumptions are violated. Is unsupervised performance estimation really impossible when both covariates and labels shift? In this paper, we answer this question in the negative. To do so, we introduce Sparse Joint Shift (SJS), a new distribution shift model in which the labels and a small subset of the features shift jointly. We characterize the mathematical conditions under which SJS is identifiable, showing that unsupervised performance estimation is indeed feasible when a few features shift together with the labels. In addition, we propose SEES, an algorithmic framework for performance estimation under SJS. Preliminary experiments show that SEES estimates performance more accurately than existing paradigms. This opens the door to tackling the joint shift of both covariates and labels without observing the new datasets’ labels.
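For context, here is a minimal illustrative sketch of the label-shift paradigm that the abstract contrasts with SJS: when only p(y) shifts and p(x|y) is fixed, accuracy on an unlabeled target set can be estimated by solving a linear system built from the source confusion matrix (black-box shift estimation). This is not the paper's SEES method; the function name and interface below are hypothetical.

```python
import numpy as np

def estimate_target_accuracy_label_shift(y_src, yhat_src, yhat_tgt, n_classes):
    """Estimate model accuracy on an unlabeled target set under the
    label-shift assumption (p(x|y) fixed, p(y) shifts), in the style of
    black-box shift estimation. Illustrative sketch, not the SEES method.

    y_src:    true labels on the labeled source set
    yhat_src: model predictions on the source set
    yhat_tgt: model predictions on the unlabeled target set
    """
    y_src, yhat_src, yhat_tgt = map(np.asarray, (y_src, yhat_src, yhat_tgt))

    # Joint confusion matrix on the labeled source set:
    # C[i, j] = p_src(yhat = i, y = j).
    C = np.zeros((n_classes, n_classes))
    np.add.at(C, (yhat_src, y_src), 1.0)
    C /= len(y_src)

    # Distribution of predictions on the unlabeled target set:
    # mu[i] = p_tgt(yhat = i).
    mu = np.bincount(yhat_tgt, minlength=n_classes) / len(yhat_tgt)

    # Under label shift, mu = C @ w with w[j] = p_tgt(y=j) / p_src(y=j),
    # so the importance weights follow from a linear solve
    # (assumes C is invertible, i.e. the classifier is informative).
    w = np.clip(np.linalg.solve(C, mu), 0.0, None)

    # Per-class accuracy p_src(yhat=c | y=c) is invariant under label shift;
    # reweight it by the estimated target label distribution.
    p_src_y = np.bincount(y_src, minlength=n_classes) / len(y_src)
    acc_per_class = np.array([
        (yhat_src[y_src == c] == c).mean() if (y_src == c).any() else 0.0
        for c in range(n_classes)
    ])
    p_tgt_y = w * p_src_y
    p_tgt_y /= p_tgt_y.sum()
    return float(acc_per_class @ p_tgt_y)
```

When a few features shift together with the labels, as in SJS, the label-shift-only estimate above is no longer valid; that gap is what SEES is designed to close.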
