Poster

Label Noise: Ignorance Is Bliss

Yilun Zhu · Jianxin Zhang · Aditya Gangrade · Clay Scott

Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

We establish a new theoretical framework for learning under multi-class, instance-dependent label noise. At the heart of our framework is the concept of \emph{relative signal strength}, which is a point-wise measure of noisiness. We use relative signal strength to establish matching upper and lower bounds for excess risk, and identify precise conditions that guarantee Bayes-consistency of a classifier learned from noisy labels. Our theoretical findings reveal a surprising result: the extremely simple \emph{Noise Ignorant Empirical Risk Minimization (NI-ERM)} principle, which conducts empirical risk minimization as if no label noise exists, is minimax optimal. Finally, we translate these theoretical insights into practice: by using NI-ERM to fit a linear classifier on top of a self-supervised feature extractor, we achieve state-of-the-art performance on the CIFAR-N data challenge.
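The NI-ERM principle described above can be illustrated with a toy sketch: train a standard classifier on noisy labels exactly as if they were clean, then evaluate against the clean labels. This is a minimal synthetic illustration, not the paper's actual experiments; the data, the 30% symmetric noise rate, and the use of scikit-learn's logistic regression are all assumptions for demonstration (the paper's setting is multi-class, instance-dependent noise with a self-supervised feature extractor on CIFAR-N).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d = 2000, 16

# Synthetic "features" standing in for a frozen feature extractor (assumption).
X = rng.normal(size=(n, d))
y_clean = (X[:, 0] > 0).astype(int)

# Corrupt 30% of labels with symmetric flips (a simple stand-in noise model).
flip = rng.random(n) < 0.3
y_noisy = np.where(flip, 1 - y_clean, y_clean)

# NI-ERM: ordinary empirical risk minimization on the noisy labels,
# with no correction or reweighting for the noise.
clf = LogisticRegression().fit(X, y_noisy)

# Despite training on noisy labels, the learned linear classifier
# recovers the clean decision rule well.
acc_clean = clf.score(X, y_clean)
```

The point of the sketch is that no noise-handling machinery appears anywhere: the classifier is fit to the corrupted labels directly, yet its accuracy on the clean labels far exceeds the 70% label-agreement ceiling of the noisy training data.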
