Poster

Confidence Calibration of Classifiers with Many Classes

Adrien Le Coz · Stéphane Herbin · Faouzi Adjed


Abstract:

For classification models based on neural networks, the maximum predicted class probability is often used as a confidence score. This score rarely reflects the actual probability that the prediction is correct, so a post-processing calibration step is typically required. However, many confidence calibration methods fail on problems with many classes. To address this issue, we transform the problem of calibrating a multiclass classifier into that of calibrating a single surrogate binary classifier. This reduction allows standard calibration methods to be applied more effectively. We evaluate our approach on numerous neural networks for image and text classification and show that it significantly improves existing calibration methods.
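The abstract leaves the reduction implicit, but one natural reading is that the multiclass output is summarized by a binary event: whether the top prediction is correct. The sketch below illustrates that idea under stated assumptions; it uses scikit-learn's isotonic regression as the binary calibrator, and the function names, data shapes, and choice of calibrator are illustrative, not necessarily the authors' exact construction.

    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    def fit_surrogate_calibrator(probs_val, labels_val):
        # probs_val: (N, K) softmax outputs on a held-out set; labels_val: (N,) true classes.
        confidence = probs_val.max(axis=1)                # raw confidence score per example
        correct = probs_val.argmax(axis=1) == labels_val  # binary target: is the top prediction correct?
        calibrator = IsotonicRegression(out_of_bounds="clip")
        calibrator.fit(confidence, correct.astype(float))
        return calibrator

    def calibrated_confidence(calibrator, probs_test):
        # Map raw max-probability scores to calibrated probabilities of correctness.
        return calibrator.predict(probs_test.max(axis=1))

At test time, the calibrated score can stand in for the raw maximum probability wherever a confidence estimate is needed, e.g. when rejecting low-confidence predictions.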
