

Poster

Near-Optimal Bounds for Learning Gaussian Halfspaces with Random Classification Noise

Ilias Diakonikolas · Jelena Diakonikolas · Daniel Kane · Puqian Wang · Nikos Zarifis

Great Hall & Hall B1+B2 (level 1) #1722

Abstract: We study the problem of learning general (i.e., not necessarily homogeneous) halfspaces with Random Classification Noise under the Gaussian distribution. We establish nearly-matching algorithmic and Statistical Query (SQ) lower bound results, revealing a surprising information-computation gap for this basic problem. Specifically, the sample complexity of this learning problem is $\widetilde{\Theta}(d/\epsilon)$, where $d$ is the dimension and $\epsilon$ is the excess error. Our positive result is a computationally efficient learning algorithm with sample complexity $\widetilde{O}(d/\epsilon + d/(\max(p, \epsilon))^2)$, where $p$ quantifies the bias of the target halfspace. On the lower bound side, we show that any efficient SQ algorithm (or low-degree test) for the problem requires sample complexity at least $\Omega(d^{1/2}/(\max(p, \epsilon))^2)$. Our lower bound suggests that this quadratic dependence on $1/\epsilon$ is inherent for efficient algorithms.
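To make the setting concrete, the sketch below simulates the data-generating model the abstract describes: examples drawn from a standard Gaussian, labeled by a general (possibly biased, i.e., non-homogeneous) halfspace, with each label flipped independently with some probability (Random Classification Noise). The specific conventions here are assumptions, not taken from this page: the halfspace is written as sign(⟨w, x⟩ - t) with unit-norm w, and eta denotes the RCN flip rate (eta < 1/2).

```python
# Minimal sketch of the abstract's data model, under the assumptions above:
# x ~ N(0, I_d), clean label sign(<w, x> - t), label flipped w.p. eta (RCN).
import numpy as np

def sample_rcn_halfspace(n, d, w, t, eta, seed=None):
    """Draw n labeled examples (x, y) from a Gaussian halfspace with RCN."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, d))      # Gaussian marginal N(0, I_d)
    y_clean = np.sign(X @ w - t)         # general halfspace (t != 0 allowed)
    y_clean[y_clean == 0] = 1            # break ties on the boundary
    flips = rng.random(n) < eta          # independent label noise
    y = np.where(flips, -y_clean, y_clean)
    return X, y

# Example: a biased halfspace in d = 10 dimensions with 10% label noise.
rng = np.random.default_rng(0)
d = 10
w = rng.standard_normal(d)
w /= np.linalg.norm(w)                   # unit normal vector (assumed convention)
X, y = sample_rcn_halfspace(n=1000, d=d, w=w, t=0.5, eta=0.1, seed=1)
print("empirical Pr[y = 1]:", (y == 1).mean())
```

Here the threshold $t$ controls how biased the halfspace is; the abstract's parameter $p$ quantifies this bias (its precise definition is given in the paper).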
