Poster
Non-convex Finite-Sum Optimization Via SCSG Methods
Lihua Lei · Cheng Ju · Jianbo Chen · Michael Jordan

Mon Dec 04 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #157
We develop a class of algorithms, as variants of the stochastically controlled stochastic gradient (SCSG) methods, for the smooth nonconvex finite-sum optimization problem. Assuming only the smoothness of each component, the complexity of SCSG to reach a stationary point with $\mathbb{E}\|\nabla f(x)\|^{2}\le \epsilon$ is $O(\min\{\epsilon^{-5/3}, \epsilon^{-1}n^{2/3}\})$, which strictly outperforms stochastic gradient descent. Moreover, SCSG is never worse than the state-of-the-art methods based on variance reduction, and it significantly outperforms them when the target accuracy is low. A similar acceleration is also achieved when the functions satisfy the Polyak-Lojasiewicz condition. Empirical experiments demonstrate that SCSG outperforms stochastic gradient methods on training multi-layer neural networks in terms of both training and validation loss.
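The sketch below illustrates the SCSG-style structure described above: each outer iteration computes a snapshot gradient on a subsampled batch (rather than the full sum), then runs an inner loop of geometrically distributed length using variance-reduced minibatch updates. The toy least-squares objective, batch sizes, step size, and iteration counts are illustrative assumptions, not the paper's experimental settings.

```python
# Minimal NumPy sketch of an SCSG-style outer/inner loop on a toy
# least-squares finite sum. All hyperparameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 20                      # number of components and dimension
A = rng.standard_normal((n, d))
y = rng.standard_normal(n)

def grad(x, idx):
    """Average gradient of 0.5*(a_i^T x - y_i)^2 over the indices in idx."""
    Ai, yi = A[idx], y[idx]
    return Ai.T @ (Ai @ x - yi) / len(idx)

def scsg(x, outer_iters=50, B=100, b=10, eta=0.05):
    for _ in range(outer_iters):
        batch = rng.choice(n, size=B, replace=False)
        g_snapshot = grad(x, batch)      # large-batch gradient at the snapshot
        x_snap = x.copy()
        # Inner-loop length is geometric with mean B/b, as in SCSG.
        N = rng.geometric(b / B)
        for _ in range(N):
            mini = rng.choice(n, size=b, replace=False)
            # Variance-reduced direction: minibatch gradient corrected by the snapshot.
            v = grad(x, mini) - grad(x_snap, mini) + g_snapshot
            x = x - eta * v
    return x

x_hat = scsg(np.zeros(d))
print("final squared gradient norm:",
      np.linalg.norm(grad(x_hat, np.arange(n))) ** 2)
```

Subsampling the snapshot batch (size B < n) is what distinguishes this scheme from full-gradient variance reduction such as SVRG and is the source of the $\epsilon^{-5/3}$ term in the complexity bound quoted above.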

Author Information

Lihua Lei (UC Berkeley)
Cheng Ju (University of California, Berkeley)
Jianbo Chen (University of California, Berkeley)
Michael Jordan (UC Berkeley)
