Poster in Workshop: OPT 2022: Optimization for Machine Learning

Sufficient conditions for non-asymptotic convergence of Riemannian optimization methods

Vishwak Srinivasan · Ashia Wilson


Abstract:

Motivated by energy-based analyses of descent methods in the Euclidean setting, we investigate a generalisation of such analyses to descent methods over Riemannian manifolds. In doing so, we find that it is possible to derive curvature-free guarantees for these methods. This also enables us to give the first known guarantees for a Riemannian cubic-regularised Newton algorithm over g-convex functions, extending the guarantees of Agarwal et al. for an adaptive Riemannian cubic-regularised Newton algorithm over general non-convex functions. This analysis motivates us to study acceleration of Riemannian gradient descent in the g-convex setting, where we improve on an existing result of Alimisis et al., albeit with a curvature-dependent rate. Finally, extending the analysis of Ahn and Sra, we provide sufficient conditions for the acceleration of Riemannian descent methods in the strongly geodesically convex setting.
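As a concrete illustration of the energy-based template alluded to above (a minimal sketch with assumed notation, not the paper's actual analysis): in the Euclidean setting one certifies a non-asymptotic rate for gradient descent by exhibiting a decreasing energy, and the Riemannian generalisation replaces the straight-line update and squared norm with the exponential map and geodesic distance.

```latex
% Illustrative only: the standard Lyapunov (energy) template for
% gradient descent in the Euclidean setting; the step size $\eta$ and
% weights $A_k$ are assumed notation, not taken from the abstract.
\[
  x_{k+1} = x_k - \eta \nabla f(x_k),
  \qquad
  \mathcal{E}_k = A_k \bigl(f(x_k) - f(x^\ast)\bigr)
                + \tfrac{1}{2}\,\lVert x_k - x^\ast \rVert^2 .
\]
% Showing $\mathcal{E}_{k+1} \le \mathcal{E}_k$ yields the rate
% $f(x_k) - f(x^\ast) \le \mathcal{E}_0 / A_k$. The natural Riemannian
% analogue uses the exponential map and the geodesic distance $d$:
\[
  x_{k+1} = \mathrm{Exp}_{x_k}\!\bigl(-\eta\,\mathrm{grad}\, f(x_k)\bigr),
  \qquad
  \mathcal{E}_k = A_k \bigl(f(x_k) - f(x^\ast)\bigr)
                + \tfrac{1}{2}\, d(x_k, x^\ast)^2 .
\]
% Curvature typically enters when bounding $\mathcal{E}_{k+1} - \mathcal{E}_k$
% via metric comparison inequalities; a "curvature-free" guarantee is
% one whose final rate carries no such curvature constant.
```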
