Oral
Efficient active learning of sparse halfspaces with arbitrary bounded noise
Chicheng Zhang · Jie Shen · Pranjal Awasthi
Tue Dec 08 06:15 AM – 06:30 AM (PST) @ Orals & Spotlights: Learning Theory
We study active learning of homogeneous $s$-sparse halfspaces in $\mathbb{R}^d$ under the setting where the unlabeled data distribution is isotropic log-concave and each label is flipped with probability at most $\eta$ for a parameter $\eta \in \big[0, \frac12\big)$, known as the bounded noise model. Even in the presence of mild label noise, i.e. when $\eta$ is a small constant, this is a challenging problem and only recently have label complexity bounds of the form $\tilde{O}\big(s \cdot \mathrm{polylog}(d, \frac{1}{\epsilon})\big)$ been established in [Zhang 2018] for computationally efficient algorithms. In contrast, under high levels of label noise, the label complexity bounds achieved by computationally efficient algorithms are much worse: the best known result [Awasthi et al. 2016] provides a computationally efficient algorithm with label complexity $\tilde{O}\big((s \ln d/\epsilon)^{\mathrm{poly}(1/(1-2\eta))}\big)$, which is label-efficient only when the noise rate $\eta$ is a fixed constant. In this work, we substantially improve on it by designing a polynomial-time algorithm for active learning of $s$-sparse halfspaces, with a label complexity of $\tilde{O}\big(\frac{s}{(1-2\eta)^4} \mathrm{polylog}(d, \frac{1}{\epsilon})\big)$. This is the first efficient algorithm with label complexity polynomial in $\frac{1}{1-2\eta}$ in this setting, which is label-efficient even for $\eta$ arbitrarily close to $\frac12$. Our active learning algorithm and its theoretical guarantees also immediately translate to new state-of-the-art label and sample complexity results for full-dimensional active and passive halfspace learning under arbitrary bounded noise.
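The bounded noise model in the abstract can be illustrated with a small simulation. This is a minimal sketch (not the paper's algorithm): it draws data from an isotropic Gaussian, one simple example of an isotropic log-concave distribution, labels it with a hidden $s$-sparse halfspace, and flips each label independently with probability $\eta$. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

d, s, n, eta = 50, 5, 10_000, 0.3  # dimension, sparsity, sample size, noise rate

# Hidden s-sparse unit vector w* defining the halfspace x -> sign(<w*, x>)
w_star = np.zeros(d)
w_star[:s] = rng.standard_normal(s)
w_star /= np.linalg.norm(w_star)

# Isotropic Gaussian unlabeled data (a simple isotropic log-concave choice)
X = rng.standard_normal((n, d))
clean = np.sign(X @ w_star)

# Bounded noise: each label is flipped independently with probability eta < 1/2
flip = rng.random(n) < eta
y = np.where(flip, -clean, clean)

noise_rate = np.mean(y != clean)
print(f"empirical flip rate: {noise_rate:.3f}")
```

Because $\eta < \frac12$, the observed label still agrees with $\mathrm{sign}(\langle w^*, x\rangle)$ on a majority of points, which is what makes learning possible; the paper's contribution is doing this label-efficiently with only $\mathrm{poly}(\frac{1}{1-2\eta})$ dependence on the noise level.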
Author Information
Chicheng Zhang (University of Arizona)
Jie Shen (Stevens Institute of Technology)
Pranjal Awasthi (Google/Rutgers University)
Related Events (a corresponding poster, oral, or spotlight)

2020 Poster: Efficient active learning of sparse halfspaces with arbitrary bounded noise »
Tue. Dec 8th 05:00 – 07:00 PM Room: Poster Session 1 #516
More from the Same Authors

2022 Poster: PopArt: Efficient Sparse Regression and Experimental Design for Optimal Sparse Linear Bandits »
Kyoungseok Jang · Chicheng Zhang · Kwang-Sung Jun 
2022 Poster: On Efficient Online Imitation Learning via Classification »
Yichen Li · Chicheng Zhang 
2022 Poster: List-Decodable Sparse Mean Estimation »
Shiwei Zeng · Jie Shen 
2021 Poster: Provably efficient multi-task reinforcement learning with model transfer »
Chicheng Zhang · Zhi Wang 
2020 Poster: Crush Optimism with Pessimism: Structured Bandits Beyond Asymptotic Optimality »
Kwang-Sung Jun · Chicheng Zhang 
2020 Poster: Efficient Contextual Bandits with Continuous Actions »
Maryam Majzoubi · Chicheng Zhang · Rajan Chari · Akshay Krishnamurthy · John Langford · Aleksandrs Slivkins 
2020 Poster: Adversarial robustness via robust low-rank representations »
Pranjal Awasthi · Himanshu Jain · Ankit Singh Rawat · Aravindan Vijayaraghavan 
2020 Poster: PAC-Bayes Learning Bounds for Sample-Dependent Priors »
Pranjal Awasthi · Satyen Kale · Stefani Karp · Mehryar Mohri 
2019 Poster: On Robustness to Adversarial Examples and Polynomial Optimization »
Pranjal Awasthi · Abhratanu Dutta · Aravindan Vijayaraghavan 
2017 Poster: Partial Hard Thresholding: Towards A Principled Analysis of Support Recovery »
Jie Shen · Ping Li 
2014 Poster: Online Optimization for Max-Norm Regularization »
Jie Shen · Huan Xu · Ping Li