

Poster in Workshop on Distribution Shifts: New Frontiers with Foundation Models

Beyond Top-Class Agreement: Using Divergences to Forecast Performance under Distribution Shift

Mona Schirmer · Dan Zhang · Eric Nalisnick

Keywords: [ distribution shifts ] [ model disagreement ] [ performance estimation ]


Abstract:

Knowing whether a model will generalize to data "in the wild" is crucial for safe deployment. To this end, we study notions of model disagreement that consider the full predictive distribution, specifically disagreement based on the Hellinger distance and the Kullback–Leibler divergence. We find that divergence-based scores provide better test error estimates and higher detection rates on out-of-distribution data than their top-1 counterparts. Experiments involve standard vision and foundation models.
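The abstract contrasts top-1 (argmax) disagreement with divergence-based disagreement over the full predictive distribution. The sketch below is illustrative only, not the authors' code: it assumes two classifiers' softmax outputs on the same (possibly shifted) inputs and computes top-1 disagreement alongside mean Hellinger distance and mean KL divergence; the function names and synthetic data are hypothetical.

```python
import numpy as np

def top1_disagreement(p, q):
    """Fraction of inputs where the two models' argmax predictions differ."""
    return np.mean(np.argmax(p, axis=1) != np.argmax(q, axis=1))

def hellinger_distance(p, q):
    """Mean Hellinger distance between two categorical predictive distributions."""
    return np.mean(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2, axis=1)))

def kl_divergence(p, q, eps=1e-12):
    """Mean KL(p || q) between two categorical predictive distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.mean(np.sum(p * np.log(p / q), axis=1))

# Synthetic example: p and q stand in for (n_samples, n_classes) softmax
# outputs of two independently trained models on the same evaluation data.
rng = np.random.default_rng(0)
logits_a = rng.normal(size=(1000, 10))
logits_b = logits_a + rng.normal(scale=0.5, size=(1000, 10))
p = np.exp(logits_a) / np.exp(logits_a).sum(axis=1, keepdims=True)
q = np.exp(logits_b) / np.exp(logits_b).sum(axis=1, keepdims=True)

print("top-1 disagreement:", top1_disagreement(p, q))
print("mean Hellinger distance:", hellinger_distance(p, q))
print("mean KL divergence:", kl_divergence(p, q))
```

Unlike the top-1 score, the two divergence-based scores remain sensitive to shifts in predictive confidence even when the argmax predictions of the two models coincide.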
