Multiclass extensions of the support vector machine (SVM) have been formulated in a variety of ways. A recent empirical comparison of nine such formulations [Doğan et al. 2016] recommends the variant proposed by Weston and Watkins (WW), despite the fact that the WW-hinge loss is not calibrated with respect to the 0-1 loss. In this work we introduce a novel discrete loss function for multiclass classification, the ordered partition loss, and prove that the WW-hinge loss is calibrated with respect to this loss. We also argue that the ordered partition loss is maximally informative among discrete losses satisfying this property. Finally, we apply our theory to justify the empirical observation made by Doğan et al. that the WW-SVM can work well even under massive label noise, a challenging setting for multiclass SVMs.
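For readers unfamiliar with the loss under discussion: given a score vector f in R^k and true label y, the WW-hinge loss is the standard sum of margin violations, sum over j != y of max(0, 1 - (f_y - f_j)). The minimal NumPy sketch below (function name and example values are ours, not from the paper) illustrates only this formula, not the paper's calibration analysis.

import numpy as np

def ww_hinge_loss(scores, y):
    """Weston-Watkins hinge loss: sum_{j != y} max(0, 1 - (f_y - f_j))."""
    margins = scores[y] - np.delete(scores, y)  # f_y - f_j for each j != y
    return float(np.maximum(0.0, 1.0 - margins).sum())

# Example with 3 classes and true label y = 0:
# margins = [1.0 - 0.5, 1.0 - (-1.0)] = [0.5, 2.0], so the loss is 0.5 + 0.0 = 0.5.
print(ww_hinge_loss(np.array([1.0, 0.5, -1.0]), 0))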
Author Information
Yutong Wang (University of Michigan)
Clayton Scott (University of Michigan)
More from the Same Authors
- 2020 Poster: Learning from Label Proportions: A Mutual Contamination Framework (Clayton Scott · Jianxin Zhang)
- 2020 Poster: Consistent Estimation of Identifiable Nonparametric Mixture Models from Grouped Observations (Alexander Ritchie · Robert Vandermeulen · Clayton Scott)
- 2017 Poster: Multi-Task Learning for Contextual Bandits (Aniket Anand Deshmukh · Urun Dogan · Clay Scott)
- 2014 Poster: Robust Kernel Density Estimation by Scaling and Projection in Hilbert Space (Robert A Vandermeulen · Clayton Scott)
- 2011 Poster: Generalizing from Several Related Classification Tasks to a New Unlabeled Sample (Gilles Blanchard · Gyemin Lee · Clayton Scott)
- 2010 Poster: Extensions of Generalized Binary Search to Group Identification and Exponential Costs (Gowtham Bellala · Suresh Bhavnani · Clayton Scott)
- 2008 Poster: Performance analysis for L_2 kernel classification (JooSeuk Kim · Clayton Scott)
- 2008 Spotlight: Performance analysis for L_2 kernel classification (JooSeuk Kim · Clayton Scott)