

Oral
in
Workshop on Distribution Shifts: New Frontiers with Foundation Models

OpenOOD v1.5: Enhanced Benchmark for Out-of-Distribution Detection

Jingyang Zhang · Jingkang Yang · Pengyun Wang · Haoqi Wang · Yueqian Lin · Haoran Zhang · Yiyou Sun · Xuefeng Du · Kaiyang Zhou · Wayne Zhang · Yixuan Li · Ziwei Liu · Yiran Chen · Hai Li

Keywords: [ Distribution Shifts ] [ Novelty Detection ] [ Out-of-Distribution Detection ] [ Open-Set Recognition ]

Fri 15 Dec 11:55 a.m. PST — 12:05 p.m. PST

Abstract:

Out-of-Distribution (OOD) detection is critical for the reliable operation of open-world intelligent systems. Despite the growing number of OOD detection methods, inconsistent evaluation practices make it difficult to track progress in the field. OpenOOD v1 initiated the unification of OOD detection evaluation but faced limitations in scalability and scope. In response, this paper presents OpenOOD v1.5, a significant improvement over its predecessor that ensures accurate and standardized evaluation of OOD detection methodologies at large scale. Notably, OpenOOD v1.5 extends its evaluation capabilities to large-scale datasets (ImageNet) and foundation models (e.g., CLIP and DINOv2), and expands its scope to investigate full-spectrum OOD detection, which considers semantic and covariate distribution shifts at the same time. This work also contributes in-depth analysis and insights derived from comprehensive experimental results, thereby enriching the knowledge pool of OOD detection methodologies. With these enhancements, OpenOOD v1.5 aims to drive advancements and offer a more robust and comprehensive evaluation benchmark for OOD detection research.
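To make the task concrete, here is a minimal, dependency-free sketch of the classic maximum-softmax-probability (MSP) baseline, one of the standard post-hoc detectors that benchmarks of this kind evaluate. This is an illustrative implementation of the general technique, not OpenOOD's actual code; the function names and the threshold value are hypothetical.

```python
import math

def msp_score(logits):
    """Maximum softmax probability of a classifier's logits.

    Higher scores indicate the model is more confident, which the MSP
    baseline treats as evidence the input is in-distribution (ID).
    """
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    return max(exps) / sum(exps)

def is_ood(logits, threshold=0.5):
    """Flag a sample as OOD when its MSP confidence falls below a threshold.

    The threshold here is arbitrary; in practice it is chosen from a
    validation set, and benchmarks report threshold-free metrics
    such as AUROC instead of a single operating point.
    """
    return msp_score(logits) < threshold

# A confidently classified sample (peaked logits) is kept as ID,
# while a uniformly uncertain sample is flagged as OOD.
confident = [8.0, 0.1, 0.2]
uncertain = [1.0, 1.0, 1.0]
print(is_ood(confident), is_ood(uncertain))
```

Full-spectrum evaluation, as studied in the paper, additionally requires that covariate-shifted ID samples (e.g., corrupted ImageNet images) are *not* flagged by such a detector, which is precisely where confidence-based scores tend to struggle.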
