Non-Convex SGD Learns Halfspaces with Adversarial Label Noise
Ilias Diakonikolas · Vasilis Kontonis · Christos Tzamos · Nikos Zarifis

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #228
We study the problem of agnostically learning homogeneous halfspaces in the distribution-specific PAC model. For a broad family of structured distributions, including log-concave distributions, we show that non-convex SGD efficiently converges to a solution with misclassification error $O(\mathrm{opt})+\epsilon$, where $\mathrm{opt}$ is the misclassification error of the best-fitting halfspace. In sharp contrast, we show that optimizing any convex surrogate inherently leads to misclassification error of $\omega(\mathrm{opt})$, even under Gaussian marginals.
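To illustrate the flavor of the algorithm, here is a minimal sketch of SGD on a non-convex sigmoidal surrogate of the 0-1 loss for learning a homogeneous halfspace. The specific surrogate, step size, and projection step are illustrative assumptions, not the paper's exact procedure; the data uses Gaussian marginals with a small fraction of flipped labels as a stand-in for adversarial label noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: Gaussian marginals, labels from a true homogeneous
# halfspace with ~5% flipped labels (a stand-in for opt > 0).
d, n = 5, 20000
w_star = np.zeros(d)
w_star[0] = 1.0
X = rng.standard_normal((n, d))
y = np.sign(X @ w_star)
flip = rng.random(n) < 0.05
y[flip] *= -1

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Non-convex surrogate (hypothetical choice): loss(w) = mean sigma(-y <w, x>),
# a bounded smooth proxy for the 0-1 loss.
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
lr, batch = 0.5, 128
for step in range(2000):
    idx = rng.integers(0, n, batch)
    m = -y[idx] * (X[idx] @ w)            # negative margins
    s = sigmoid(m)
    # gradient of sigma(-y <w, x>) w.r.t. w is -sigma'(m) * y * x,
    # with sigma'(m) = s * (1 - s)
    grad = -((s * (1 - s) * y[idx]) @ X[idx]) / batch
    w -= lr * grad
    w /= np.linalg.norm(w)                # project back to the unit sphere

err = np.mean(np.sign(X @ w) != y)
print(f"misclassification error: {err:.3f}")
```

Because the halfspace is homogeneous, only the direction of $w$ matters, so the iterate is renormalized after each step; on this synthetic data the final error should land near the injected noise rate, consistent with an $O(\mathrm{opt})+\epsilon$ guarantee.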

Author Information

Ilias Diakonikolas (UW Madison)
Vasilis Kontonis (University of Wisconsin-Madison)
Christos Tzamos (UW Madison)
Nikos Zarifis (University of Wisconsin-Madison)
