Spotlight
Parameter-Free Online Learning via Model Selection
Dylan J Foster · Satyen Kale · Mehryar Mohri · Karthik Sridharan

Tue Dec 5th 05:40 -- 05:45 PM @ Hall C
We introduce a new framework for deriving efficient algorithms that obtain model selection oracle inequalities in the adversarial online learning setting, sometimes described as parameter-free online learning. While previous work in this area has focused on specific, highly structured function classes, such as nested balls in a Hilbert space, we eschew this approach and propose a generic meta-algorithm framework that achieves oracle inequalities under minimal structural assumptions. This allows us to derive new computationally efficient algorithms with oracle bounds for a wide range of settings where such results were previously unavailable. We give the first computationally efficient algorithms that work in arbitrary Banach spaces under mild smoothness assumptions; previous results applied only to the Hilbert case. We further derive new oracle inequalities for various matrix classes, non-nested convex sets, and $\mathbb{R}^{d}$ with generic regularizers. Finally, we generalize further by providing oracle inequalities for arbitrary non-linear classes in the contextual learning model; in particular, we give new algorithms for learning with multiple kernels. These results are all derived through a unified meta-algorithm scheme built on a novel "multi-scale" algorithm for prediction with expert advice using random playout, which may be of independent interest.
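To give a flavor of the "multi-scale" idea, the sketch below shows a toy variant of exponential weights in which each expert gets its own learning rate, inversely proportional to the scale (loss range) of that expert. This is only an illustration of the multi-scale principle under assumed scales, not the paper's actual algorithm, which additionally relies on random playout; all function names and parameters here are hypothetical.

```python
import math

def multiscale_weights(losses, scales):
    """Toy multi-scale exponential-weights update.

    Expert i is assumed to suffer losses bounded by scales[i], and
    receives its own learning rate eta_i = 1 / scales[i], so experts
    with small loss ranges are not drowned out by large-scale ones.
    Illustrative sketch only; random playout is omitted.
    """
    n = len(scales)
    etas = [1.0 / c for c in scales]       # per-expert learning rates
    log_w = [0.0] * n                      # log-weights, uniform start
    for round_losses in losses:            # one loss vector per round
        for i in range(n):
            log_w[i] -= etas[i] * round_losses[i]
    m = max(log_w)                         # normalize in log space for stability
    w = [math.exp(v - m) for v in log_w]
    z = sum(w)
    return [x / z for x in w]

# Two experts: one with losses in [0, 1], one with losses in [0, 100].
p = multiscale_weights([[0.9, 10.0], [0.8, 5.0]], scales=[1.0, 100.0])
```

With a single shared learning rate, the large-scale expert's losses would force a tiny rate on everyone; the per-expert rates let the small-scale expert be learned at its natural speed.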

#### Author Information

##### Mehryar Mohri (Courant Institute and Google)

Mehryar Mohri is a Professor of Computer Science and Mathematics at the Courant Institute of Mathematical Sciences and a Research Consultant at Google. Prior to these positions, he spent about ten years at AT&T Bell Labs, later AT&T Labs-Research, where he served for several years as a Department Head and a Technology Leader. His research interests cover a number of different areas: primarily machine learning, algorithms and theory, automata theory, speech processing, and natural language processing, and also computational biology. His research in learning theory and algorithms has been used in a variety of applications. His work on automata theory and algorithms has served as the foundation for several applications in language processing, with several of his algorithms deployed in virtually all spoken-dialog and speech recognition systems used in the United States. He has co-authored several software libraries widely used in research and academic labs. He is also co-author of the machine learning textbook Foundations of Machine Learning, used in graduate courses on machine learning at several universities and corporate research laboratories.