
Fast Rates for Exp-concave Empirical Risk Minimization
Tomer Koren · Kfir Y. Levy

Thu Dec 10 08:00 AM -- 12:00 PM (PST) @ 210 C #99
We consider Empirical Risk Minimization (ERM) in the context of stochastic optimization with exp-concave and smooth losses---a general optimization framework that captures several important learning problems, including linear and logistic regression, learning SVMs with the squared hinge loss, portfolio selection, and more. In this setting, we establish the first evidence that ERM is able to attain fast generalization rates, and show that the expected loss of the ERM solution in $d$ dimensions converges to the optimal expected loss at a rate of $d/n$. This rate matches existing lower bounds up to constants and improves by a $\log{n}$ factor upon the state of the art, which was previously only known to be attained by an online-to-batch conversion of computationally expensive online algorithms.
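As a concrete illustration of the setting, the sketch below runs ERM on logistic regression, one of the exp-concave problems the abstract mentions. It is a hypothetical minimal example, not the paper's method: the synthetic data, the plain gradient-descent solver, and all parameter choices are assumptions for illustration only; the paper's result concerns the ERM solution itself, not any particular optimization algorithm.

```python
import numpy as np

# Minimal sketch (assumed setup): ERM for logistic regression.
# The logistic loss is exp-concave and smooth on a bounded domain,
# so it fits the framework described in the abstract.

rng = np.random.default_rng(0)
d, n = 5, 2000
w_star = rng.normal(size=d) / np.sqrt(d)       # ground-truth predictor (assumed)
X = rng.normal(size=(n, d))
p = 1.0 / (1.0 + np.exp(-X @ w_star))
y = (rng.random(n) < p).astype(float) * 2 - 1  # labels in {-1, +1}

def logistic_loss(w, X, y):
    # Empirical risk: mean of log(1 + exp(-y_i <x_i, w>)).
    return np.mean(np.log1p(np.exp(-y * (X @ w))))

def grad(w, X, y):
    # Gradient of the empirical logistic loss.
    z = -y * (X @ w)
    s = 1.0 / (1.0 + np.exp(-z))               # sigmoid(z)
    return X.T @ (-y * s) / len(y)

# ERM: minimize the empirical risk; here via plain gradient descent
# (an assumed solver choice, step size and iteration count are arbitrary).
w = np.zeros(d)
for _ in range(500):
    w -= 0.5 * grad(w, X, y)

erm_loss = logistic_loss(w, X, y)
ref_loss = logistic_loss(w_star, X, y)
# The empirical minimizer is at least as good in-sample as the true predictor.
print(erm_loss <= ref_loss + 1e-3)
```

Repeating this over a grid of $(d, n)$ pairs and averaging the excess *population* loss over fresh samples is one way to observe empirically the $d/n$ scaling the paper proves.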

Author Information

Tomer Koren (Technion)
Kfir Y. Levy (Technion)
