Poster in Workshop: OPT 2023: Optimization for Machine Learning

Level Set Teleportation: the Good, the Bad, and the Ugly

Aaron Mishkin · Alberto Bietti · Robert Gower


Abstract:

We study level set teleportation, an optimization sub-routine that seeks to accelerate gradient methods by maximizing the gradient norm over the level set of parameters sharing the same objective value. Since the descent lemma implies that gradient descent decreases the objective in proportion to the squared norm of the gradient, level set teleportation maximizes the one-step progress guarantee. We prove that level set teleportation neither improves nor worsens the convergence of gradient descent for strongly convex functions, while for convex functions teleportation can move iterates arbitrarily far from the global minimizers. To evaluate teleportation in practice, we develop a projected-gradient-type method requiring only Hessian-vector products. We use this method to show that initializing gradient methods via level set teleportation slightly underperforms standard initializations on both convex and non-convex optimization problems. As a result, we report a mixed picture: teleportation can be evaluated efficiently, but it appears to offer only marginal gains.
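The abstract describes teleportation as maximizing the gradient norm over a level set via a projected-gradient-type method that needs only Hessian-vector products. Below is a minimal JAX sketch of this idea; the `teleport` function, the fixed step size, the iteration count, and the linearized (Newton-type) projection back onto the level set are illustrative assumptions, not the authors' exact algorithm.

```python
import jax
import jax.numpy as jnp


def teleport(f, w0, steps=50, eta=1e-2):
    """Sketch of level set teleportation: ascend the squared gradient
    norm of f while (approximately) staying on the level set
    {w : f(w) = f(w0)} via a linearized projection step."""
    grad_f = jax.grad(f)
    f0 = f(w0)  # objective value defining the level set
    w = w0
    for _ in range(steps):
        g = grad_f(w)
        # Ascent direction for 0.5 * ||grad f(w)||^2 is the
        # Hessian-vector product H(w) @ g, computed matrix-free.
        hvp = jax.grad(lambda u: jnp.vdot(grad_f(u), g))(w)
        w = w + eta * hvp
        # Linearized projection back onto the level set: a Newton-type
        # correction along the gradient restores f(w) ~= f0.
        g = grad_f(w)
        w = w - ((f(w) - f0) / (jnp.vdot(g, g) + 1e-12)) * g
    return w


# Hypothetical test problem: a strongly convex quadratic.
A = jnp.diag(jnp.array([1.0, 10.0]))
f = lambda w: 0.5 * jnp.vdot(w, A @ w)
w0 = jnp.array([1.0, 1.0])
w_tel = teleport(f, w0)
print(f(w0), f(w_tel))                      # objective values should match
print(jnp.linalg.norm(jax.grad(f)(w0)),     # gradient norm before ...
      jnp.linalg.norm(jax.grad(f)(w_tel)))  # ... vs. after teleportation
```

On this quadratic, the sketch should move the iterate along the ellipse `f(w) = f(w0)` toward the high-curvature axis, increasing the gradient norm while leaving the objective value unchanged.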
