

Tutorial

Agnostic Learning: Algorithms and Theory

Adam T Kalai

Regency E/F

Abstract:

Agnostic Learning (Kearns, Schapire and Sellie '92; Haussler '90) is a branch of Computational Learning Theory in which few or no assumptions are made about the true function being learned. The results are powerful, noise-tolerant binary-classification algorithms. Like Valiant's seminal PAC model, the algorithms are required to be computationally efficient. Unlike the PAC model, however, Agnostic Learning assumes nothing about the "true" labeling function, and hence it can also be viewed as learning with arbitrary or even adversarial noise.
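For concreteness, the standard agnostic guarantee (a sketch of the usual formulation, with notation chosen here rather than taken from the tutorial itself) asks the learner, given samples from an arbitrary distribution D over labeled examples and a fixed concept class C, to efficiently output a hypothesis h that is nearly as accurate as the best classifier in C:

\[
  \Pr_{(x,y)\sim D}\big[h(x)\neq y\big]
  \;\le\;
  \min_{c\in C}\,\Pr_{(x,y)\sim D}\big[c(x)\neq y\big] \;+\; \varepsilon ,
\]

with probability at least \(1-\delta\) over the samples. Because no assumption is placed on D or on how the labels are generated, the guarantee holds even when the labels disagree arbitrarily with every function in C.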

Recent progress has shown that several classical algorithms can be improved and made agnostic. Moreover, the agnostic perspective sheds light on several key Machine Learning notions, such as Fourier learning, SVMs, Active Learning, Decision Tree Learning, and Boosting. Hence, this tutorial on Agnostic Learning also presents an opportunity to review and reconsider many beautiful notions from Computational Learning Theory.
