Poster
Escaping from saddle points on Riemannian manifolds
Yue Sun · Nicolas Flammarion · Maryam Fazel

Thu Dec 12 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #199
We consider minimizing a nonconvex, smooth function $f$ on a Riemannian manifold $\mathcal{M}$. We show that a perturbed version of the gradient descent algorithm converges to a second-order stationary point for this problem (and hence is able to escape saddle points on the manifold). While the unconstrained problem is well-studied, our result is the first to prove such a rate for nonconvex, manifold-constrained problems. The rate of convergence scales as $1/\epsilon^2$ in the accuracy $\epsilon$, matching a rate previously known only for unconstrained smooth minimization. The convergence rate also depends polynomially on parameters describing the curvature of the manifold and the smoothness of the function.
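To make the setup concrete, below is a minimal Python sketch of perturbed Riemannian gradient descent on the unit sphere (a simple instance of $\mathcal{M}$), using normalization as the retraction. The step size eta, gradient threshold eps, and perturbation radius r are illustrative placeholders, and the perturbation schedule is simplified relative to the algorithm analyzed in the paper; this is a sketch of the general technique, not the authors' exact method.

import numpy as np

def project_tangent(x, g):
    """Project an ambient-space gradient g onto the tangent space of the sphere at x."""
    return g - np.dot(x, g) * x

def retract(x, v):
    """Retract a tangent step back onto the sphere by renormalizing."""
    y = x + v
    return y / np.linalg.norm(y)

def perturbed_rgd(grad_f, x0, eta=0.01, eps=1e-3, r=1e-3, n_iters=10000, seed=0):
    """Sketch of perturbed Riemannian gradient descent on the unit sphere.

    When the Riemannian gradient is small (a candidate first-order
    stationary point), inject a small random tangent perturbation so the
    iterates can escape saddle points; otherwise take a gradient step.
    """
    rng = np.random.default_rng(seed)
    x = x0 / np.linalg.norm(x0)
    for _ in range(n_iters):
        g = project_tangent(x, grad_f(x))
        if np.linalg.norm(g) <= eps:
            # Near-stationary: perturb in a random tangent direction of radius r.
            xi = project_tangent(x, rng.standard_normal(x.shape))
            x = retract(x, r * xi / np.linalg.norm(xi))
        else:
            x = retract(x, -eta * g)
    return x

# Example: find the leading eigenvector of a symmetric matrix A by
# minimizing f(x) = -x^T A x over the sphere. The starting point below
# is a non-leading eigenvector of A, i.e. a saddle point of the problem,
# which the random perturbation allows the iterates to escape.
A = np.diag([3.0, 1.0, 0.5])
x = perturbed_rgd(lambda x: -2.0 * A @ x, x0=np.array([0.0, 1.0, 0.0]))

In this toy run the Riemannian gradient vanishes exactly at the saddle, so a plain gradient step would stay put; only the perturbation step moves the iterate off the saddle and toward the minimizer at the leading eigenvector.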

Author Information

Yue Sun (University of Washington)

I'm a fourth-year Ph.D. student in the Department of Electrical Engineering at the University of Washington, Seattle, advised by Prof. Maryam Fazel. I received my Bachelor of Engineering degree from the Department of Electronic Engineering at Tsinghua University, China. I'm interested in statistical machine learning, optimization, and signal processing. Currently I'm working on first-order algorithms for nonconvex optimization problems, and on regularized control/reinforcement learning problems. From June to September 2019 I interned at Google, working on online optimization algorithms applied to video coding.

Nicolas Flammarion (EPFL)
Maryam Fazel (University of Washington)
