
On Structured Prediction Theory with Calibrated Convex Surrogate Losses.
Simon Lacoste-Julien

Fri Dec 08 09:00 AM -- 09:30 AM (PST)
Event URL: http://www.di.ens.fr/~slacoste/

We provide novel theoretical insights on structured prediction in the context of efficient convex surrogate loss minimization with consistency guarantees. For any task loss, we construct a convex surrogate that can be optimized via stochastic gradient descent, and we prove tight bounds on the so-called "calibration function" relating the excess surrogate risk to the actual risk. In contrast to prior related work, we carefully track the effect of the exponential number of classes on both the learning guarantees and the optimization complexity. As an interesting consequence, we formalize the intuition that some task losses make learning harder than others, and that the classical 0-1 loss is ill-suited for general structured prediction.

This is joint work with Anton Osokin and Francis Bach (paper: https://arxiv.org/abs/1703.02403).
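As a rough sketch of the central object (the notation below is an assumption based on the linked arXiv paper, not a quote from this page): for a task loss L and a convex surrogate Φ, the calibration function can be written informally as

    H_{\Phi, L}(\varepsilon) \;=\; \inf_{f} \big\{\, \delta\Phi(f) \;:\; \delta L(f) \ge \varepsilon \,\big\},

where δΦ(f) denotes the excess surrogate risk and δL(f) the excess task risk of a predictor f. Driving the excess surrogate risk below H_{\Phi,L}(\varepsilon) (more precisely, below its convex closure at ε) then guarantees an excess task risk of at most ε, so how fast H_{\Phi,L}(\varepsilon) grows with ε, and how it scales with the exponentially large output space, governs how hard learning is for a given task loss.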

Author Information

Simon Lacoste-Julien (Mila, Université de Montréal)

Simon Lacoste-Julien is an associate professor at Mila and DIRO at Université de Montréal, and a Canada CIFAR AI Chair holder. He also heads Samsung's SAIT AI Lab Montreal part time. His research interests are machine learning and applied math, with applications in related fields such as computer vision and natural language processing. He obtained a B.Sc. in mathematics, physics and computer science from McGill, earned a PhD in computer science from UC Berkeley, and did a postdoc at the University of Cambridge. He spent a few years as research faculty at INRIA and École normale supérieure in Paris before returning to his roots in Montreal in 2016, answering the call from Yoshua Bengio to help grow the Montreal AI ecosystem.
