Poster
Online Gradient Boosting
Alina Beygelzimer · Elad Hazan · Satyen Kale · Haipeng Luo

Mon Dec 7th 07:00 -- 11:59 PM @ 210 C #93

We extend the theory of boosting for regression problems to the online learning setting. Generalizing from the batch setting, we model a weak learning algorithm as an online learning algorithm with linear loss functions that competes with a base class of regression functions, and a strong learning algorithm as an online learning algorithm with smooth convex loss functions that competes with a larger class of regression functions. Our main result is an online gradient boosting algorithm that converts a weak online learning algorithm into a strong one for which the larger class of functions is the linear span of the base class. We also give a simpler boosting algorithm that converts a weak online learning algorithm into a strong one for which the larger class is the convex hull of the base class, and we prove its optimality.
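
The simpler convex-hull conversion admits a compact sketch. The Python code below is a hedged reconstruction, not the authors' pseudocode: the `LinearWeakLearner` class, the `predict`/`update` interface, and all parameter names are assumptions made for illustration. It blends N copies of a weak online learner with Frank-Wolfe style convex weights, and after each round feeds copy i a linear loss whose slope is the gradient of the round's loss at the (i-1)-th partial prediction.

```python
import numpy as np

class LinearWeakLearner:
    """Toy weak learner (an assumption for this sketch, not from the paper):
    online gradient descent over linear predictors, fed linear losses."""

    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr

    def predict(self, x):
        return float(self.w @ x)

    def update(self, x, slope):
        # Linear loss f(y) = slope * y, so its gradient w.r.t. w is slope * x.
        self.w -= self.lr * slope * x


class ConvexHullBooster:
    """Frank-Wolfe style online booster: blends N copies of a weak online
    learner with convex weights so the combination competes with the convex
    hull of the base class. Call predict(x) before update(x, ...) each round."""

    def __init__(self, make_weak_learner, n_learners):
        self.learners = [make_weak_learner() for _ in range(n_learners)]
        # Step sizes eta_i = 2 / (i + 1), i = 1..N: the classic Frank-Wolfe schedule.
        self.etas = [2.0 / (i + 1) for i in range(1, n_learners + 1)]
        self._partials = []

    def predict(self, x):
        # Build the partial predictions y^0, ..., y^{N-1}; these are the points
        # at which the round's loss gradient is evaluated in update().
        y = 0.0
        self._partials = []
        for learner, eta in zip(self.learners, self.etas):
            self._partials.append(y)
            y = (1.0 - eta) * y + eta * learner.predict(x)
        return y

    def update(self, x, loss_grad):
        # loss_grad(y): derivative of the round's convex loss at prediction y.
        # Weak learner i sees the linear loss with slope loss_grad(y^{i-1}).
        for learner, y_prev in zip(self.learners, self._partials):
            learner.update(x, loss_grad(y_prev))


# Usage on a toy regression stream with squared loss (y - y_true)^2.
booster = ConvexHullBooster(lambda: LinearWeakLearner(dim=3), n_learners=20)
rng = np.random.default_rng(0)
for t in range(1000):
    x = rng.normal(size=3)
    y_hat = booster.predict(x)
    y_true = 0.5 * x[0] - 0.2 * x[1]
    booster.update(x, lambda y: 2.0 * (y - y_true))
```

The main span-based algorithm in the paper layers additional per-stage scaling on top of a loop of this shape; the sketch above only illustrates the simpler convex-hull conversion.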

Author Information

Alina Beygelzimer (Yahoo Labs)
Elad Hazan (Princeton University)
Satyen Kale (Yahoo Labs)
Haipeng Luo (Princeton University)
