Poster
Escape saddle points by a simple gradient-descent based algorithm
Chenyi Zhang · Tongyang Li
Virtual
Keywords: [ Optimization ]
Abstract:
Escaping saddle points is a central research topic in nonconvex optimization. In this paper, we propose a simple gradient-based algorithm such that for a smooth function $f\colon\mathbb{R}^n\to\mathbb{R}$, it outputs an $\epsilon$-approximate second-order stationary point in $\tilde{O}(\log n/\epsilon^{1.75})$ iterations. Compared to the previous state-of-the-art algorithms by Jin et al. with $\tilde{O}(\log^{4} n/\epsilon^{2})$ or $\tilde{O}(\log^{6} n/\epsilon^{1.75})$ iterations, our algorithm is polynomially better in terms of $\log n$ and matches their complexities in terms of $\epsilon$. For the stochastic setting, our algorithm outputs an $\epsilon$-approximate second-order stationary point in $\tilde{O}(\log^{2} n/\epsilon^{4})$ iterations. Technically, our main contribution is the idea of implementing a robust Hessian power method using only gradients, which can find negative curvature near saddle points and achieves a polynomial speedup in $\log n$ over perturbed gradient descent methods. Finally, we also perform numerical experiments that support our results.
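To make the key idea concrete, here is a minimal Python sketch (not the paper's actual algorithm; the function names, step size, and iteration count are hypothetical) of a Hessian power method driven only by gradient evaluations: the Hessian-vector product $Hv$ is approximated by a finite difference of gradients, and iterating $v \leftarrow (I - \eta H)v$ amplifies the eigenvector of the most negative Hessian eigenvalue near a saddle point.

```python
import numpy as np

def hessian_vector_product(grad, x, v, r=1e-5):
    """Approximate H(x) @ v using only gradients:
    Hv ~ (grad(x + r*v) - grad(x)) / r (finite difference)."""
    return (grad(x + r * v) - grad(x)) / r

def negative_curvature_direction(grad, x, eta=0.1, iters=200, seed=0):
    """Power method on (I - eta*H). For eta small enough, the dominant
    eigenvector of (I - eta*H) is the eigenvector of H's most negative
    eigenvalue, i.e., a negative-curvature direction at a saddle point."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(x.shape)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = v - eta * hessian_vector_product(grad, x, v)
        v /= np.linalg.norm(v)
    return v

# Demo on f(x) = x0^2 - x1^2, whose origin is a saddle point.
if __name__ == "__main__":
    grad = lambda x: np.array([2.0 * x[0], -2.0 * x[1]])
    v = negative_curvature_direction(grad, x=np.zeros(2))
    print(v)  # should align with the negative-curvature axis (0, +-1)
```

In the toy demo the gradient is linear, so the finite difference recovers $Hv$ exactly; for a general smooth $f$, the approximation error is controlled by the displacement $r$, which is what the "robust" analysis in the paper must account for.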