Poster

Optimal and Adaptive Monteiro-Svaiter Acceleration

Yair Carmon · Danielle Hausler · Arun Jambulapati · Yujia Jin · Aaron Sidford

Hall J (level 1) #815

Keywords: [ adaptive methods ] [ parameter-free methods ] [ proximal points ] [ optimal algorithms ] [ convex optimization ] [ second-order methods ] [ momentum ] [ conjugate residuals ] [ Newton's method ] [ oracle complexity ] [ Monteiro-Svaiter acceleration ] [ optimization theory ] [ cubic regularization ]


Abstract: We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any p ≥ 2 we improve the complexity of convex optimization with Lipschitz pth derivative by a logarithmic factor, matching a lower bound. We also introduce an MS subproblem solver that requires no knowledge of problem parameters, and implement it as either a second- or first-order method by solving linear systems or applying MinRes, respectively. On logistic regression problems our method outperforms previous accelerated second-order methods but underperforms Newton's method; simply iterating our first-order adaptive subproblem solver is competitive with L-BFGS.
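To make the two subproblem-solver variants mentioned above concrete, here is a minimal sketch, not the authors' implementation: a regularized Newton step of the generic form (H + λI) δ = −g, solved either by forming the Hessian and solving a linear system (the second-order variant) or matrix-free via MinRes using only Hessian-vector products (the first-order variant). The logistic-regression setup, the fixed regularization lam, and all function names are illustrative assumptions.

```python
# Illustrative sketch only: one regularized Newton step (H + lam*I) d = -g,
# the shape of subproblem the abstract describes. Not the paper's code.
import numpy as np
from scipy.special import expit
from scipy.sparse.linalg import LinearOperator, minres

def logistic_grad(X, y, w):
    # Gradient of mean logistic loss; labels y in {-1, +1}.
    s = expit(-y * (X @ w))
    return -(X.T @ (y * s)) / len(y)

def newton_step_direct(X, y, w, lam):
    # "Second-order" variant: form H explicitly and solve the linear system.
    s = expit(-y * (X @ w))
    d = s * (1.0 - s)                      # diagonal Hessian weights
    H = (X.T * d) @ X / len(y)             # X^T diag(d) X / m
    g = logistic_grad(X, y, w)
    return w + np.linalg.solve(H + lam * np.eye(len(w)), -g)

def newton_step_minres(X, y, w, lam):
    # "First-order" variant: solve the same system with MinRes, touching
    # the Hessian only through Hessian-vector products.
    s = expit(-y * (X @ w))
    d = s * (1.0 - s)
    hvp = lambda v: (X.T @ (d * (X @ v))) / len(y) + lam * v
    A = LinearOperator((len(w), len(w)), matvec=hvp)
    g = logistic_grad(X, y, w)
    delta, info = minres(A, -g)
    return w + delta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    y = np.sign(rng.standard_normal(200))
    w = np.zeros(10)
    for _ in range(10):
        w = newton_step_minres(X, y, w, lam=1e-3)
```

The trade-off this illustrates: the direct variant pays to form and factor the Hessian, while the MinRes variant needs only matrix-vector products per inner iteration, which is roughly the distinction the abstract draws between the second- and first-order implementations of the subproblem solver.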
