

Poster

Parameter-Free Online Learning via Model Selection

Dylan J Foster · Satyen Kale · Mehryar Mohri · Karthik Sridharan

Pacific Ballroom #63

Keywords: [ Online Learning ] [ Learning Theory ]


Abstract: We introduce an efficient algorithmic framework for model selection in online learning, or parameter-free online learning. Our algorithms satisfy oracle inequalities in the adversarial online learning setting. Unlike previous work in this area that has focused on specific, highly structured function classes, such as nested balls in a Hilbert space, we propose a generic meta-algorithm framework that achieves oracle inequalities under minimal structural assumptions: we give the first computationally efficient algorithms that work in arbitrary Banach spaces under mild smoothness assumptions, whereas previous results only applied to Hilbert spaces. We further derive new oracle inequalities for various matrix classes, non-nested convex sets, and $\mathbb{R}^{d}$ with generic regularizers. Finally, we generalize these results by providing oracle inequalities for arbitrary non-linear classes in the contextual learning model; in particular, we give new algorithms for learning with multiple kernels. These results are all derived through a unified meta-algorithm scheme using a novel "multi-scale" algorithm for prediction with expert advice based on random playout, which may be of independent interest.
