Adaptive acceleration without strong convexity priors or restarts
Joao V. Cavalcanti · Laurent Lessard · Ashia Wilson
Abstract
In this paper, we propose a parameter-free algorithm for smooth, strongly convex problems called NAG-free. To our knowledge, NAG-free is the first adaptive algorithm capable of directly estimating the strong convexity parameter without priors or restart schemes. We prove that NAG-free converges globally at least as fast as gradient descent, and that it achieves accelerated convergence locally around the minimum when the Hessian is locally smooth at the minimum and other mild additional assumptions hold. We present real-world experiments in which NAG-free is competitive with restart schemes and adapts to better local curvature conditions.
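To make the idea of online curvature estimation concrete, the following is a minimal, hypothetical Python sketch of a Nesterov-style method that maintains running secant estimates of the smoothness constant L and the strong convexity parameter mu from observed iterates and gradients. The estimators, initialization, and stopping rule below are illustrative assumptions for exposition only, not the NAG-free update rules from the paper.

```python
import numpy as np

def nag_with_online_curvature(grad, x0, iters=1000, tol=1e-8):
    """Nesterov-style momentum with on-the-fly curvature estimates.

    Hypothetical sketch: the secant-type estimators for L (smoothness)
    and mu (strong convexity) are illustrative, not the paper's rules.
    """
    x = y = np.asarray(x0, dtype=float)
    g = grad(y)
    L_hat, mu_hat = 1.0, 1.0           # crude initial guesses for curvature bounds
    for _ in range(iters):
        x_next = y - g / L_hat         # gradient step with estimated step size 1/L
        g_next = grad(x_next)
        dx, dg = x_next - y, g_next - g
        if np.linalg.norm(dx) > 0:
            # secant estimates: ||dg||/||dx|| upper-bounds local curvature;
            # <dg, dx>/||dx||^2 lies in [mu, L] and serves as a rough mu proxy
            L_hat = max(L_hat, np.linalg.norm(dg) / np.linalg.norm(dx))
            mu_hat = min(mu_hat, (dg @ dx) / (dx @ dx))
        kappa = L_hat / max(mu_hat, 1e-12)
        beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)  # strongly convex momentum
        y = x_next + beta * (x_next - x)
        x = x_next
        g = grad(y)
        if np.linalg.norm(g) < tol:
            break
    return x

# Example on an ill-conditioned quadratic (true L = 100, mu = 1):
A = np.diag([1.0, 100.0])
x_min = nag_with_online_curvature(lambda x: A @ x, x0=np.ones(2))
```

In a more careful implementation, a backtracking or line-search safeguard would replace the crude initial guess for L_hat used in this sketch.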