Potential-Based Agnostic Boosting
Adam Kalai · Varun Kanade

Wed Dec 09 07:00 PM -- 11:59 PM (PST)

We prove strong noise-tolerance properties of a potential-based boosting algorithm, similar to MadaBoost (Domingo and Watanabe, 2000) and SmoothBoost (Servedio, 2003). Our analysis is in the agnostic framework of Kearns, Schapire, and Sellie (1994), giving polynomial-time guarantees in the presence of arbitrary noise. A remarkable feature of our algorithm is that it can be implemented without reweighting examples, by randomly relabeling them instead. Our boosting theorem gives, as easy corollaries, alternative derivations of two recent non-trivial results in computational learning theory: agnostically learning decision trees (Gopalan et al., 2008) and agnostically learning halfspaces (Kalai et al., 2005). Experiments suggest that the algorithm performs similarly to MadaBoost.
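To make the relabeling idea concrete, here is a minimal illustrative sketch, not the paper's actual algorithm: it uses a MadaBoost-style capped exponential weight w = min(1, exp(-y F(x))) and simulates reweighting an example by w by keeping its true label with probability w and otherwise drawing a uniformly random ±1 label, so the relabeled example's expected correlation with the true label is exactly w. The threshold-stump weak learner, fixed step size, and synthetic noisy data are all assumptions made for the sake of a runnable demo.

```python
import math
import random

random.seed(0)

def madaboost_weight(margin):
    # Capped exponential potential weight in the style of MadaBoost:
    # min(1, exp(-y * F(x))); always in (0, 1].
    return min(1.0, math.exp(-margin))

def relabel(x, y, w):
    # Simulate weighting (x, y) by w in [0, 1] via random relabeling:
    # keep the true label with probability w, otherwise draw a
    # uniformly random +/-1 label.  The expected correlation of the
    # emitted label with y is exactly w.
    if random.random() < w:
        return (x, y)
    return (x, random.choice((-1, 1)))

def stump_learner(samples):
    # Toy weak learner (an assumption, not from the paper): the best
    # 1-D threshold stump by training correlation.
    best_t, best_s, best_corr = 0.0, 1, -2.0
    for t in sorted({x for x, _ in samples}):
        for s in (-1, 1):
            corr = sum(y * (s if x >= t else -s)
                       for x, y in samples) / len(samples)
            if corr > best_corr:
                best_t, best_s, best_corr = t, s, corr
    return lambda x, t=best_t, s=best_s: s if x >= t else -s

def boost(data, rounds=10, step=0.1):
    # Potential-based boosting loop implemented without reweighting:
    # each round, the weak learner is trained on relabeled examples.
    scores = [0.0] * len(data)          # running vote F_t(x_i)
    stumps = []
    for _ in range(rounds):
        relabeled = [relabel(x, y, madaboost_weight(y * f))
                     for (x, y), f in zip(data, scores)]
        h = stump_learner(relabeled)
        stumps.append(h)
        scores = [f + step * h(x) for (x, _), f in zip(data, scores)]
    return lambda x: 1 if sum(h(x) for h in stumps) >= 0 else -1

# Demo: 1-D threshold concept with roughly 10% flipped labels.
data = [(random.random(), 0) for _ in range(200)]
data = [(x, (1 if x >= 0.5 else -1) * (-1 if random.random() < 0.1 else 1))
        for x, _ in data]
clf = boost(data, rounds=10)
err = sum(clf(x) != y for x, y in data) / len(data)
```

Because labels are flipped arbitrarily rather than weights clipped, the weak learner only ever sees unweighted labeled examples, which is what makes the scheme easy to run on top of any off-the-shelf learner.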

Author Information

Adam Kalai (Microsoft Research New England)
Varun Kanade (UC Berkeley)