Spotlight Poster
Accelerated Quasi-Newton Proximal Extragradient: Faster Rate for Smooth Convex Optimization
Ruichen Jiang · Aryan Mokhtari
Great Hall & Hall B1+B2 (level 1) #1213
Abstract:
In this paper, we propose an accelerated quasi-Newton proximal extragradient method for solving unconstrained smooth convex optimization problems. With access only to the gradients of the objective, we prove that our method can achieve a convergence rate of $\mathcal{O}\bigl(\min\{\frac{1}{k^2}, \frac{\sqrt{d\log k}}{k^{2.5}}\}\bigr)$, where $d$ is the problem dimension and $k$ is the number of iterations. In particular, in the regime where $k = \mathcal{O}(d)$, our method matches the _optimal rate_ of $\mathcal{O}(\frac{1}{k^2})$ achieved by Nesterov's accelerated gradient (NAG). Moreover, in the regime where $k = \Omega(d\log d)$, it outperforms NAG and converges at a _faster rate_ of $\mathcal{O}\bigl(\frac{\sqrt{d\log k}}{k^{2.5}}\bigr)$. To the best of our knowledge, this result is the first to demonstrate a provable gain for a quasi-Newton-type method over NAG in the convex setting. To achieve such results, we build our method on a recent variant of the Monteiro-Svaiter acceleration framework and adopt an online learning perspective to update the Hessian approximation matrices, in which we relate the convergence rate of our method to the dynamic regret of a specific online convex optimization problem in the space of matrices.
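As a quick sanity check on the two regimes (a sketch assuming only the rate expression stated above, with constants ignored), one can locate the crossover between the two terms of the rate:

```latex
% The second term of the rate dominates (is smaller) once
%   sqrt(d log k) / k^{2.5} <= 1 / k^2,
% which, after multiplying both sides by k^{2.5}, reduces as follows:
\[
  \frac{\sqrt{d\log k}}{k^{2.5}} \;\le\; \frac{1}{k^{2}}
  \quad\Longleftrightarrow\quad
  \sqrt{d\log k} \;\le\; \sqrt{k}
  \quad\Longleftrightarrow\quad
  k \;\ge\; d\log k .
\]
% The condition k >= d log k holds whenever k = Omega(d log d),
% matching the regime in which the method provably outperforms NAG.
```

This is consistent with the two regimes above: for $k = \mathcal{O}(d)$ the $\frac{1}{k^2}$ term is the smaller one, while for $k = \Omega(d\log d)$ the $\frac{\sqrt{d\log k}}{k^{2.5}}$ term takes over.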