Oral
Efficient active learning of sparse halfspaces with arbitrary bounded noise
Chicheng Zhang · Jie Shen · Pranjal Awasthi
Tue Dec 08 06:15 AM -- 06:30 AM (PST) @ Orals & Spotlights: Learning Theory
We study active learning of homogeneous $s$-sparse halfspaces in $\mathbb{R}^d$ in the setting where the unlabeled data distribution is isotropic log-concave and each label is flipped with probability at most $\eta$ for some parameter $\eta \in \big[0, \frac{1}{2}\big)$, a setting known as the bounded noise model. Even in the presence of mild label noise, i.e., when $\eta$ is a small constant, this is a challenging problem, and only recently have label complexity bounds of the form $\tilde{O}\big(s \cdot \mathrm{polylog}(d, \frac{1}{\epsilon})\big)$ been established in [Zhang 2018] for computationally efficient algorithms. In contrast, under high levels of label noise, the label complexity bounds achieved by computationally efficient algorithms are much worse: the best known result [Awasthi et al. 2016] provides a computationally efficient algorithm with label complexity $\tilde{O}\big((s \ln d/\epsilon)^{\mathrm{poly}(1/(1-2\eta))}\big)$, which is label-efficient only when the noise rate $\eta$ is a fixed constant. In this work, we substantially improve on this by designing a polynomial-time algorithm for active learning of $s$-sparse halfspaces with a label complexity of $\tilde{O}\big(\frac{s}{(1-2\eta)^4} \mathrm{polylog}(d, \frac{1}{\epsilon})\big)$. This is the first efficient algorithm in this setting with label complexity polynomial in $\frac{1}{1-2\eta}$, and it is label-efficient even for $\eta$ arbitrarily close to $\frac{1}{2}$. Our active learning algorithm and its theoretical guarantees also immediately translate to new state-of-the-art label and sample complexity results for full-dimensional active and passive halfspace learning under arbitrary bounded noise.
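To make the problem setting concrete, here is a minimal Python sketch of a generic margin-based active learner with hard thresholding for sparsity. It is an illustration only, not the authors' algorithm or its guarantees; the names and parameters (`noisy_label`, `hard_threshold`, `query_budget`, the margin schedule, the averaging update) are hypothetical choices made for this sketch, under the assumption of Gaussian (hence isotropic log-concave) unlabeled data and a bounded-noise label oracle.

```python
# Illustrative sketch only (not the authors' algorithm): a generic margin-based
# active learner for an s-sparse halfspace under bounded label noise.
import numpy as np

rng = np.random.default_rng(0)

d, s, eta = 200, 5, 0.3              # dimension, sparsity, noise rate (< 1/2)
w_star = np.zeros(d)
w_star[:s] = 1.0
w_star /= np.linalg.norm(w_star)     # hidden s-sparse halfspace

def noisy_label(x):
    """Bounded-noise oracle: the correct sign is flipped w.p. at most eta."""
    y = np.sign(w_star @ x) or 1.0
    return -y if rng.random() < eta else y

def hard_threshold(v, s):
    """Keep only the s largest-magnitude coordinates (enforces sparsity)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

# Margin-based active learning loop: query labels only for points that fall
# close to the current decision boundary, then update and re-sparsify.
w = np.zeros(d)
w[0] = 1.0
margin, query_budget, queries = 1.0, 2000, 0
for _ in range(10):                                  # epochs with shrinking margin
    X, Y = [], []
    while len(X) < 200 and queries < query_budget:
        x = rng.standard_normal(d)                   # stand-in for isotropic log-concave data
        if abs(w @ x) <= margin:                     # query only near-boundary points
            X.append(x)
            Y.append(noisy_label(x))
            queries += 1
    if not X:
        break
    X, Y = np.array(X), np.array(Y)
    w = hard_threshold(X.T @ Y / len(Y), s)          # crude label-weighted averaging update
    w /= np.linalg.norm(w) + 1e-12
    margin *= 0.7                                    # focus future queries nearer the boundary

print("queries used:", queries, "| alignment with w*:", round(float(w @ w_star), 3))
```

The sketch captures the two ingredients highlighted in the abstract, label queries restricted to a shrinking margin region and explicit sparsity enforcement, but uses a simple averaging update in place of the paper's optimization procedure.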
Author Information
Chicheng Zhang (University of Arizona)
Jie Shen (Stevens Institute of Technology)
Pranjal Awasthi (Google/Rutgers University)
Related Events (a corresponding poster, oral, or spotlight)
- 2020 Poster: Efficient active learning of sparse halfspaces with arbitrary bounded noise »
  Tue Dec 8th 05:00 -- 07:00 PM @ Poster Session 1
More from the Same Authors
- 2020 Poster: Crush Optimism with Pessimism: Structured Bandits Beyond Asymptotic Optimality »
  Kwang-Sung Jun · Chicheng Zhang
- 2020 Poster: Efficient Contextual Bandits with Continuous Actions »
  Maryam Majzoubi · Chicheng Zhang · Rajan Chari · Akshay Krishnamurthy · John Langford · Aleksandrs Slivkins
- 2020 Poster: Adversarial robustness via robust low rank representations »
  Pranjal Awasthi · Himanshu Jain · Ankit Singh Rawat · Aravindan Vijayaraghavan
- 2020 Poster: PAC-Bayes Learning Bounds for Sample-Dependent Priors »
  Pranjal Awasthi · Satyen Kale · Stefani Karp · Mehryar Mohri
- 2019 Poster: On Robustness to Adversarial Examples and Polynomial Optimization »
  Pranjal Awasthi · Abhratanu Dutta · Aravindan Vijayaraghavan
- 2017 Poster: Partial Hard Thresholding: Towards A Principled Analysis of Support Recovery »
  Jie Shen · Ping Li
- 2014 Poster: Online Optimization for Max-Norm Regularization »
  Jie Shen · Huan Xu · Ping Li