Non-Convex SGD Learns Halfspaces with Adversarial Label Noise
Ilias Diakonikolas · Vasilis Kontonis · Christos Tzamos · Nikos Zarifis
2020 Poster
Abstract
We study the problem of agnostically learning homogeneous halfspaces in the distribution-specific PAC model.
For a broad family of structured distributions, including log-concave distributions, we show that non-convex SGD
efficiently converges to a solution with misclassification error $O(\mathrm{OPT}) + \epsilon$, where $\mathrm{OPT}$ is the misclassification
error of the best-fitting halfspace. In sharp contrast, we show that optimizing any convex surrogate inherently
leads to misclassification error of $\omega(\mathrm{OPT})$, even under Gaussian marginals.
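To make the algorithmic setup concrete, the sketch below runs projected SGD on a smooth non-convex sigmoidal surrogate of the 0-1 loss for a homogeneous halfspace $\mathrm{sign}(\langle w, x \rangle)$. It is an illustration under assumed choices, not the paper's exact procedure: the surrogate temperature, step size, and the use of random label flips as a stand-in for adversarial noise are all assumptions made for the example.

```python
# Minimal sketch (assumptions, not the paper's exact algorithm): projected SGD
# on a non-convex sigmoidal surrogate ell(w; x, y) = 1 / (1 + exp(y<w,x>/sigma)),
# with iterates projected back to the unit sphere after each step.
import numpy as np

rng = np.random.default_rng(0)

def surrogate_grad(w, x, y, sigma=0.5):
    """Gradient in w of ell = 1 / (1 + exp(y<w,x>/sigma)); sigma is illustrative."""
    t = np.clip(y * np.dot(w, x) / sigma, -30.0, 30.0)  # avoid exp overflow
    s = 1.0 / (1.0 + np.exp(t))                          # surrogate value in (0, 1)
    return -(y / sigma) * s * (1.0 - s) * x

def sgd_halfspace(X, Y, steps=20000, lr=0.1):
    """Projected SGD over the unit sphere (toy hyperparameters)."""
    n, d = X.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(steps):
        i = rng.integers(n)                              # sample one example
        w -= lr * surrogate_grad(w, X[i], Y[i])
        w /= np.linalg.norm(w)                           # project to unit sphere
    return w

# Toy data: Gaussian marginals, labels from a hidden halfspace, then a small
# fraction eta of labels flipped at random (a stand-in for adversarial noise,
# so the best halfspace has error roughly eta here).
d, n, eta = 10, 5000, 0.05
w_star = rng.standard_normal(d)
w_star /= np.linalg.norm(w_star)
X = rng.standard_normal((n, d))
Y = np.sign(X @ w_star)
Y[rng.random(n) < eta] *= -1

w_hat = sgd_halfspace(X, Y)
err = np.mean(np.sign(X @ w_hat) != np.sign(X @ w_star))
print(f"disagreement with target halfspace: {err:.3f}")
```

In this toy run the surrogate is non-convex in $w$, unlike hinge or logistic loss; the abstract's lower bound says that optimizing any such convex surrogate cannot beat $\omega(\mathrm{OPT})$ error, whereas SGD on the non-convex loss attains $O(\mathrm{OPT}) + \epsilon$ for the structured distributions the paper considers.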