Boosting is a general method for producing a very accurate classification rule by combining rough and moderately inaccurate "rules of thumb." While rooted in a theoretical framework of machine learning, boosting has been found to perform quite well empirically. This tutorial will introduce the boosting algorithm AdaBoost, and explain the underlying theory of boosting, including explanations that have been given as to why boosting often does not suffer from overfitting, as well as some of the myriad other theoretical points of view that have been taken on this algorithm. Some practical applications and extensions of boosting will also be described.
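To make the "combining rules of thumb" idea concrete, here is a minimal sketch of AdaBoost using decision stumps as the weak learners. It is an illustrative assumption-laden example, not code from the tutorial: labels are assumed to lie in {-1, +1}, the stump learner and toy dataset are invented for demonstration, and no effort is made at efficiency.

```python
# Minimal AdaBoost sketch (illustrative; assumes binary labels in {-1, +1}).
import numpy as np

def fit_stump(X, y, w):
    """Pick the single-feature threshold rule with the lowest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = np.where(X[:, j] <= thresh, sign, -sign)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thresh, sign)
    return best  # (weighted error, feature index, threshold, sign)

def stump_predict(stump, X):
    _, j, thresh, sign = stump
    return np.where(X[:, j] <= thresh, sign, -sign)

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)            # start with uniform example weights
    ensemble = []                       # list of (alpha, stump) pairs
    for _ in range(rounds):
        stump = fit_stump(X, y, w)
        err = max(stump[0], 1e-12)      # this round's weighted training error
        alpha = 0.5 * np.log((1 - err) / err)   # vote weight of this weak rule
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)  # up-weight examples the rule got wrong
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    votes = sum(alpha * stump_predict(stump, X) for alpha, stump in ensemble)
    return np.sign(votes)

# Toy usage on a synthetic problem (purely for demonstration).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y, rounds=10)
print("training accuracy:", np.mean(predict(model, X) == y))
```

The key step is the reweighting line: examples misclassified by the current weak rule get more weight, so the next rule of thumb is forced to focus on the cases the ensemble still gets wrong, and the final classifier is a weighted majority vote of all the rules.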
Author Information
Robert E Schapire (Microsoft Research)
Robert Schapire received his ScB in math and computer science from Brown University in 1986, and his SM (1988) and PhD (1991) from MIT under the supervision of Ronald Rivest. After a short post-doc at Harvard, he joined the technical staff at AT&T Labs (formerly AT&T Bell Laboratories) in 1991, where he remained for eleven years. At the end of 2002, he became a Professor of Computer Science at Princeton University. His awards include the 1991 ACM Doctoral Dissertation Award, the 2003 Gödel Prize, and the 2004 Kanellakis Theory and Practice Award (the last two shared with Yoav Freund). His main research interest is in theoretical and applied machine learning.
More from the Same Authors
- 2014 Poster: A Drifting-Games Analysis for Online Learning and Applications to Boosting (Haipeng Luo · Robert E Schapire)
- 2010 Poster: A Reduction from Apprenticeship Learning to Classification (Umar Syed · Robert E Schapire)
- 2010 Oral: A Theory of Multiclass Boosting (Indraneel Mukherjee · Robert E Schapire)
- 2010 Poster: A Theory of Multiclass Boosting (Indraneel Mukherjee · Robert E Schapire)
- 2010 Poster: Non-Stochastic Bandit Slate Problems (Satyen Kale · Lev Reyzin · Robert E Schapire)
- 2007 Oral: A Multiplicative Weights Algorithm for Apprenticeship Learning (Umar Syed · Robert E Schapire)
- 2007 Oral: FilterBoost: Regression and Classification on Large Datasets (Joseph K Bradley · Robert E Schapire)
- 2007 Poster: FilterBoost: Regression and Classification on Large Datasets (Joseph K Bradley · Robert E Schapire)
- 2007 Poster: A Multiplicative Weights Algorithm for Apprenticeship Learning (Umar Syed · Robert E Schapire)