
Affinity Workshop: Women in Machine Learning

When are Local Queries Useful for Robust Learning?

Pascale Gourdeau · Varun Kanade · Marta Kwiatkowska · James Worrell

Abstract: Gourdeau et al. (2019) showed that distributional assumptions are necessary for the robust learnability of concept classes when considering the exact-in-the-ball robust risk and access to random examples only. In this paper, we study learning models where the learner is given more power through the use of \emph{local} queries, and give the first \emph{distribution-free} algorithms that perform robust empirical risk minimization (ERM) for this notion of robustness. The first learning model we consider uses local membership queries (LMQ), where the learner can query the label of points near the training sample. We show that, under the uniform distribution, LMQs do not increase the robustness threshold of conjunctions and any superclass, e.g., decision lists and halfspaces. Faced with this negative result, we introduce the local \emph{equivalence} query oracle, which returns whether the hypothesis and target concept agree in the perturbation region around a point in the training sample, as well as a counterexample if one exists. We show a separation result: on one hand, if the query radius $\lambda$ is strictly smaller than the adversary's perturbation budget $\rho$, then distribution-free robust learning is impossible for a wide variety of concept classes; on the other hand, the setting $\lambda=\rho$ allows us to develop robust ERM algorithms. We then bound the query complexity of these algorithms based on online learning guarantees and further improve these bounds for the special case of conjunctions. We finish by giving robust learning algorithms for halfspaces with margins on both $\{0,1\}^n$ and $\mathbb{R}^n$.
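To make the exact-in-the-ball notion of robustness concrete, the following is a minimal brute-force sketch (not from the paper) of the robust loss over the boolean hypercube: a hypothesis suffers loss 1 at a point $x$ if it disagrees with the target concept anywhere in the Hamming ball of radius $\rho$ around $x$. The conjunction target, the hypothesis, and the example point below are all hypothetical illustrations.

```python
from itertools import combinations

def hamming_ball(x, rho):
    """Yield every point within Hamming distance rho of x (a tuple of bits)."""
    n = len(x)
    for r in range(rho + 1):
        for idx in combinations(range(n), r):
            z = list(x)
            for i in idx:
                z[i] ^= 1  # flip the chosen coordinates
            yield tuple(z)

def exact_in_ball_loss(h, c, x, rho):
    """Exact-in-the-ball robust loss at x: 1 if the hypothesis h disagrees
    with the target concept c somewhere in the rho-ball around x, else 0."""
    return int(any(h(z) != c(z) for z in hamming_ball(x, rho)))

# Hypothetical example on {0,1}^3: target conjunction c = x0 AND x1,
# hypothesis h that ignores x1.
c = lambda z: z[0] & z[1]
h = lambda z: z[0]
x = (1, 1, 0)
print(exact_in_ball_loss(h, c, x, rho=0))  # 0: h and c agree at x itself
print(exact_in_ball_loss(h, c, x, rho=1))  # 1: flipping x1 yields (1,0,0), where h=1 but c=0
```

Note that with $\rho=0$ this reduces to the standard 0-1 loss; robust ERM must drive the loss to zero at the larger radius, which is exactly what a local equivalence query with $\lambda=\rho$ can certify around each training point.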
