Beating SGD: Learning SVMs in Sublinear Time
Elad Hazan · Tomer Koren · Nati Srebro

Wed Dec 14 08:45 AM -- 02:59 PM (PST)

We present an optimization approach for linear SVMs based on a stochastic primal-dual method, in which the primal step is akin to importance-weighted SGD and the dual step is a stochastic update of the importance weights. This yields an optimization method with a sublinear dependence on the training set size, and the first method for learning linear SVMs with runtime less than the size of the training set required for learning.

Author Information

Elad Hazan (Technion)
Tomer Koren (Technion)
Nati Srebro (TTI-Chicago)
