

Poster

Reservoir Boosting: Between Online and Offline Ensemble Learning

Leonidas Lefakis · François Fleuret

Harrah's Special Events Center, 2nd Floor

Abstract:

We propose to train an ensemble with the help of a reservoir in which the learning algorithm can store a limited number of samples. This novel approach lies between offline and online ensemble approaches and can be seen either as a restriction of the former or an enhancement of the latter. We identify some basic strategies that can be used to populate this reservoir and present our main contribution, dubbed Greedy Edge Expectation Maximization (GEEM), which maintains the reservoir content in the case of Boosting by viewing the samples through their projections into the weak classifier response space. We propose an efficient algorithmic implementation which makes it tractable in practice, and demonstrate its efficiency experimentally on several computer vision datasets, on which it outperforms both online and offline methods in a memory constrained setting.
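The abstract does not spell out the GEEM criterion itself; the sketch below is only a rough, hypothetical illustration of the general setting it describes: a fixed-size reservoir fed from a data stream (here filled by plain reservoir sampling, one of the basic population strategies, not the paper's GEEM), with an AdaBoost-style ensemble of stumps trained on the reservoir contents. All names (`reservoir_boosting`, `stream`, `reservoir_size`) are invented for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def reservoir_boosting(stream, reservoir_size, n_rounds, rng=None):
    """Toy reservoir-based boosting loop (illustrative only).

    `stream` yields (x, y) pairs with y in {-1, +1}. The reservoir is
    maintained by standard reservoir sampling (a placeholder for a
    smarter selection strategy such as GEEM), and an AdaBoost-style
    ensemble of decision stumps is then fit on the retained samples.
    """
    rng = rng or np.random.default_rng(0)

    # Populate a fixed-size reservoir from the stream (Algorithm R).
    reservoir = []
    for t, (x, y) in enumerate(stream):
        if len(reservoir) < reservoir_size:
            reservoir.append((x, y))
        else:
            j = rng.integers(0, t + 1)
            if j < reservoir_size:  # replace a slot with prob R/(t+1)
                reservoir[j] = (x, y)

    X = np.array([x for x, _ in reservoir])
    Y = np.array([y for _, y in reservoir])

    # AdaBoost on the reservoir contents.
    w = np.full(len(Y), 1.0 / len(Y))
    ensemble = []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, Y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(w[pred != Y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * Y * pred)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble
```

In the paper's memory-constrained setting, the interesting part is precisely the replacement rule: GEEM chooses which samples to keep by reasoning about their projections in the weak classifier response space, rather than uniformly at random as in this sketch.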
