Spotlight Poster
Learning Noisy Halfspaces with a Margin: Massart is No Harder than Random
Gautam Chandrasekaran · Vasilis Kontonis · Konstantinos Stavropoulos · Kevin Tian
West Ballroom A-D #6508
Abstract:
We study the problem of PAC learning $\gamma$-margin halfspaces with Massart noise. We propose a simple proper learning algorithm, the Perspectron, that has sample complexity $\widetilde{O}((\epsilon\gamma)^{-2})$ and achieves classification error at most $\eta + \epsilon$, where $\eta$ is the Massart noise rate. Prior works (DGT19, CKMY20) came with worse sample complexity guarantees (in both $\epsilon$ and $\gamma$) or could only handle random classification noise (DDKWZ23, KITBMV23), a much milder noise assumption. We also show that our results extend to the more challenging setting of learning generalized linear models with a known link function under Massart noise, achieving a similar sample complexity to the halfspace case. This significantly improves upon the prior state of the art in this setting due to CKMY20, who introduced this model.
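To make the setting concrete, here is a minimal sketch of the $\gamma$-margin halfspace model with Massart label noise. The values of gamma and eta_max, and the plain normalized-perceptron learner used at the end, are illustrative assumptions; the learner is a generic baseline and is not the Perspectron algorithm from the paper.

```python
# Illustrative sketch of the gamma-margin / Massart-noise learning setup.
# gamma, eta_max, and the perceptron baseline are assumptions for demonstration;
# this is NOT the paper's Perspectron algorithm.
import numpy as np

rng = np.random.default_rng(0)

d, n = 20, 5000
gamma, eta_max = 0.1, 0.2          # margin and Massart noise bound (assumed values)

# Ground-truth unit-norm halfspace w_star.
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)

# Sample unit-norm points and keep those with margin at least gamma from w_star.
X = rng.normal(size=(4 * n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
X = X[np.abs(X @ w_star) >= gamma][:n]

y_clean = np.sign(X @ w_star)
# Massart noise: each label is flipped with probability eta(x) <= eta_max,
# where eta(x) may depend adversarially on the point x.
eta_x = rng.uniform(0, eta_max, size=len(X))
flips = rng.random(len(X)) < eta_x
y = np.where(flips, -y_clean, y_clean)

# Plain normalized perceptron as a stand-in learner (not the Perspectron).
w = np.zeros(d)
for _ in range(20):
    for x_i, y_i in zip(X, y):
        if y_i * (x_i @ w) <= 0:
            w += y_i * x_i
w /= np.linalg.norm(w) + 1e-12

err = np.mean(np.sign(X @ w) != y_clean)
print(f"clean-label error of baseline learner: {err:.3f}")
```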