
On the Efficient Minimization of Classification Calibrated Surrogates
Richard Nock · Frank Nielsen

Tue Dec 09 05:26 PM -- 05:27 PM (PST)

Bartlett et al. (2006) recently proved that a fundamental condition on convex surrogates, classification calibration, ties the minimization of the surrogate risk to that of the classification risk, and left as an important open problem the algorithmic questions surrounding the minimization of these surrogates. In this paper, we propose an algorithm that provably minimizes any strictly convex, differentiable, classification calibrated surrogate --- a set of losses that spans the exponential, logistic and squared losses --- with boosting-type guaranteed convergence rates under a weak learning assumption. A particular subclass of these surrogates, which we call balanced convex surrogates, has a key rationale tying it to maximum likelihood estimation, zero-sum games, and the set of losses satisfying some of the most common requirements for losses in supervised learning. We report experiments with 11 flavors of the algorithm on more than 50 readily available domains, which shed light on new surrogates and on the potential of data-dependent strategies for tuning surrogates.
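For readers unfamiliar with the losses named in the abstract, the following is a minimal illustrative sketch (not the paper's algorithm): it defines the three classic classification calibrated surrogates as functions of the margin, and minimizes the empirical logistic surrogate for a linear classifier by plain gradient descent on hypothetical toy data. All function names, data, and the step size are made up for illustration.

```python
import numpy as np

# Three classic classification calibrated surrogates phi(v),
# where v = y * f(x) is the margin and y is a label in {-1, +1}.
def exponential_loss(v):
    return np.exp(-v)

def logistic_loss(v):
    return np.log(1.0 + np.exp(-v))

def squared_loss(v):
    return (1.0 - v) ** 2

def fit_linear_logistic(X, y, steps=500, lr=0.1):
    """Gradient descent on the mean logistic surrogate for f(x) = w . x."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        margins = y * (X @ w)
        # Gradient of (1/n) * sum_i log(1 + exp(-y_i w . x_i)) w.r.t. w
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n
        w -= lr * grad
    return w

# Toy usage on linearly separable 2-D points.
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = fit_linear_logistic(X, y)
preds = np.sign(X @ w)
```

Driving down the surrogate (here the logistic loss) also drives down the 0/1 classification error on this data, which is the behavior that classification calibration guarantees in general.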

Author Information

Richard Nock (CEREGMIA - Univ. Antilles-Guyane)
Frank Nielsen (Sony Computer Science Laboratories)
