Poster

Third-order Smoothness Helps: Faster Stochastic Optimization Algorithms for Finding Local Minima

Yaodong Yu · Pan Xu · Quanquan Gu

Room 210 #18

Keywords: [ Non-Convex Optimization ]


Abstract: We propose stochastic optimization algorithms that can find local minima faster than existing algorithms for nonconvex optimization problems, by exploiting the third-order smoothness to escape non-degenerate saddle points more efficiently. More specifically, the proposed algorithm only needs $\tilde{O}(\epsilon^{-10/3})$ stochastic gradient evaluations to converge to an approximate local minimum $\mathbf{x}$, which satisfies $\|\nabla f(\mathbf{x})\|_2 \le \epsilon$ and $\lambda_{\min}(\nabla^2 f(\mathbf{x})) \ge -\sqrt{\epsilon}$ in unconstrained stochastic optimization, where $\tilde{O}(\cdot)$ hides poly-logarithmic terms and constants. This improves upon the $\tilde{O}(\epsilon^{-7/2})$ gradient complexity achieved by state-of-the-art stochastic local minima finding algorithms by a factor of $\tilde{O}(\epsilon^{-1/6})$. Experiments on two nonconvex optimization problems demonstrate the effectiveness of our algorithm and corroborate our theory.
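As a concrete reading of the stopping criterion in the abstract, the minimal NumPy sketch below tests whether a point with a given gradient and Hessian satisfies the stated approximate-local-minimum condition. The helper name `is_approx_local_min` and its interface are our own illustration, not part of the paper's algorithm.

```python
import numpy as np

def is_approx_local_min(grad, hess, eps):
    """Check the approximate-local-minimum condition from the abstract:
    ||grad f(x)||_2 <= eps  and  lambda_min(Hessian f(x)) >= -sqrt(eps).
    (Helper name and interface are illustrative, not from the paper.)"""
    grad_ok = np.linalg.norm(grad, 2) <= eps
    # eigvalsh returns the eigenvalues of a symmetric matrix in ascending order
    lambda_min = np.linalg.eigvalsh(hess)[0]
    return grad_ok and lambda_min >= -np.sqrt(eps)

# Example: the origin is a non-degenerate saddle of f(x, y) = x^2 - y^2,
# so it fails the second-order condition despite having zero gradient.
grad = np.zeros(2)
hess = np.diag([2.0, -2.0])  # lambda_min = -2 < -sqrt(eps)
print(is_approx_local_min(grad, hess, eps=1e-4))  # False
```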
