Poster
Early stopping for kernel boosting algorithms: A general analysis with localized complexities
Yuting Wei · Fanny Yang · Martin Wainwright

Wed Dec 06 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #215
Early stopping of iterative algorithms is a widely-used form of regularization in statistical learning, commonly used in conjunction with boosting and related gradient-type algorithms. Although consistency results have been established in some settings, such estimators are less well-understood than their analogues based on penalized regularization. In this paper, for a relatively broad class of loss functions and boosting algorithms (including $L^2$-boost, LogitBoost and AdaBoost, among others), we connect the performance of a stopped iterate to the localized Rademacher/Gaussian complexity of the associated function class. This connection allows us to show that local fixed point analysis, now standard in the analysis of penalized estimators, can be used to derive optimal stopping rules. We derive such stopping rules in detail for various kernel classes, and illustrate the correspondence of our theory with practice for Sobolev kernel classes.
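The $L^2$-boost procedure discussed in the abstract can be viewed as functional gradient descent on the squared loss over a kernel class. The sketch below is a minimal illustration, not the paper's method: it picks the stopping iteration with a held-out split, whereas the paper derives data-dependent stopping rules from localized Rademacher/Gaussian complexities without held-out data. The RBF kernel, its bandwidth, the step size, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, bandwidth=0.3):
    """Gaussian RBF kernel matrix between the rows of X and the rows of Y.
    Bandwidth is an illustrative choice, not from the paper."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def l2_boost(K, y, step=0.5, max_iters=300):
    """L2-boosting as functional gradient descent on squared loss.
    The fitted function is f(x) = sum_i alpha_i k(x, x_i); each step
    moves alpha along the negative gradient: alpha += step * (y - K @ alpha) / n.
    Yields the coefficient vector after every iteration so a stopping
    rule can inspect the whole path."""
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(max_iters):
        alpha = alpha + step * (y - K @ alpha) / n
        yield alpha.copy()

# Toy example: noisy sine data, stopping time chosen on a held-out split.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * rng.normal(size=80)
train, val = np.arange(60), np.arange(60, 80)

K = rbf_kernel(X[train], X[train])
K_val = rbf_kernel(X[val], X[train])

best_err, best_t = np.inf, 0
for t, alpha in enumerate(l2_boost(K, y[train]), start=1):
    err = np.mean((y[val] - K_val @ alpha) ** 2)
    if err < best_err:
        best_err, best_t = err, t

print(f"stopped at iteration {best_t}, held-out MSE {best_err:.4f}")
```

Running the boosting path to convergence would interpolate the noise; the held-out error here typically decreases and then flattens or rises, and the early-stopped iterate plays the role of the regularized estimator that the paper analyzes via local fixed-point conditions.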

Author Information

Yuting Wei (University of California, Berkeley)
Fanny Yang (ETH Zurich)
Martin Wainwright (UC Berkeley)
