

Spotlight Poster

Accelerated Quasi-Newton Proximal Extragradient: Faster Rate for Smooth Convex Optimization

Ruichen Jiang · Aryan Mokhtari

Great Hall & Hall B1+B2 (level 1) #1213

Abstract: In this paper, we propose an accelerated quasi-Newton proximal extragradient method for solving unconstrained smooth convex optimization problems. With access only to the gradients of the objective, we prove that our method can achieve a convergence rate of $O(\min\{1/k^2, \sqrt{d \log k}/k^{2.5}\})$, where $d$ is the problem dimension and $k$ is the number of iterations. In particular, in the regime where $k = O(d)$, our method matches the _optimal rate_ of $O(1/k^2)$ achieved by Nesterov's accelerated gradient (NAG). Moreover, in the regime where $k = \Omega(d \log d)$, it outperforms NAG and converges at a _faster rate_ of $O(\sqrt{d \log k}/k^{2.5})$. To the best of our knowledge, this result is the first to demonstrate a provable gain for a quasi-Newton-type method over NAG in the convex setting. To achieve such results, we build our method on a recent variant of the Monteiro-Svaiter acceleration framework and adopt an online learning perspective to update the Hessian approximation matrices, in which we relate the convergence rate of our method to the dynamic regret of a specific online convex optimization problem in the space of matrices.
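The two regimes in the stated bound can be sanity-checked numerically. The sketch below (not from the paper; constants are dropped and the dimension $d = 100$ is chosen arbitrarily for illustration) evaluates both terms of the rate and confirms which one attains the minimum in each regime:

```python
import math

def nag_rate(k):
    # NAG's rate, O(1/k^2), with constants dropped
    return 1.0 / k**2

def aqnpe_term(k, d):
    # The quasi-Newton term of the bound, O(sqrt(d * log k) / k^{2.5})
    return math.sqrt(d * math.log(k)) / k**2.5

def combined_rate(k, d):
    # The paper's bound is the minimum of the two terms
    return min(nag_rate(k), aqnpe_term(k, d))

d = 100
# Early regime, k = O(d): the 1/k^2 term is smaller, matching NAG
assert combined_rate(50, d) == nag_rate(50)
# Late regime, k = Omega(d log d): the sqrt(d log k)/k^{2.5} term wins
assert combined_rate(10_000, d) == aqnpe_term(10_000, d)
```

The crossover between the two terms occurs roughly where $k = d \log k$, i.e. at $k = \Theta(d \log d)$, which is consistent with the regimes stated in the abstract.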
