Poster

Optimal and Adaptive Monteiro-Svaiter Acceleration

Yair Carmon · Danielle Hausler · Arun Jambulapati · Yujia Jin · Aaron Sidford

Hall J (level 1) #815

Keywords: [ cubic regularization ] [ optimization theory ] [ Monteiro-Svaiter acceleration ] [ oracle complexity ] [ Newton's method ] [ conjugate residuals ] [ momentum ] [ second-order methods ] [ convex optimization ] [ optimal algorithms ] [ proximal points ] [ parameter-free methods ] [ adaptive methods ]


Abstract: We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any $p\ge 2$ we improve the complexity of convex optimization with Lipschitz $p$th derivative by a logarithmic factor, matching a lower bound. We also introduce an MS subproblem solver that requires no knowledge of problem parameters, and implement it as either a second- or first-order method by solving linear systems or applying MinRes, respectively. On logistic regression problems our method outperforms previous accelerated second-order methods but underperforms Newton's method; simply iterating our first-order adaptive subproblem solver yields a method competitive with L-BFGS.
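To make the structure concrete, below is a minimal Python sketch of an MS-style accelerated loop whose regularized-Newton subproblem $(H + \lambda I)\,\delta = -g$ is solved inexactly with MinRes, applied to logistic regression. This is an illustration under stated assumptions, not the paper's algorithm: the simple $\lambda$-doubling heuristic, the function names (`ms_accelerated_newton`, `logistic_loss_grad_hess`), and all constants are made up for the sketch, whereas the paper's contribution is a rigorous, parameter-free adaptive subproblem solver and the removal of the MS implicit equation.

```python
# Illustrative sketch only (NOT the paper's method): an MS-style accelerated
# regularized-Newton loop with a MinRes subproblem solve, on logistic regression.
import numpy as np
from scipy.sparse.linalg import minres

def logistic_loss_grad_hess(w, X, y):
    """Loss, gradient, and Hessian of logistic regression; labels y in {-1, +1}."""
    z = y * (X @ w)
    loss = np.mean(np.log1p(np.exp(-z)))
    s = 1.0 / (1.0 + np.exp(z))             # sigmoid(-z)
    grad = -(X.T @ (y * s)) / len(y)
    D = s * (1.0 - s)                        # per-example curvature weights
    hess = (X.T * D) @ X / len(y)
    return loss, grad, hess

def ms_accelerated_newton(X, y, iters=30, lam0=1.0):
    """MS-style acceleration: momentum sequence plus regularized Newton steps."""
    d = X.shape[1]
    x = np.zeros(d)                          # primal iterate
    v = np.zeros(d)                          # momentum/"dual" center
    A, lam = 0.0, lam0
    for _ in range(iters):
        # MS coupling: pick step weight a solving a^2 * lam = A + a.
        a = (1.0 + np.sqrt(1.0 + 4.0 * lam * A)) / (2.0 * lam)
        yk = (A * x + a * v) / (A + a)       # extrapolated query point
        _, g, H = logistic_loss_grad_hess(yk, X, y)
        # Subproblem (H + lam*I) step = -g, solved inexactly with MinRes,
        # loosely mirroring the paper's first-order (conjugate-residual) variant.
        step, _ = minres(H + lam * np.eye(d), -g)
        x_new = yk + step
        # Crude adaptive-lam heuristic (an assumption standing in for the
        # paper's parameter-free subproblem solver).
        if lam * np.linalg.norm(step) > 2.0 * np.linalg.norm(g):
            lam *= 2.0
        else:
            lam = max(lam / 2.0, 1e-8)
        _, g_new, _ = logistic_loss_grad_hess(x_new, X, y)
        v = v - a * g_new                    # momentum update
        x, A = x_new, A + a
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    y = np.sign(X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(200))
    w = ms_accelerated_newton(X, y)
    print("final loss:", logistic_loss_grad_hess(w, X, y)[0])
```

Swapping the MinRes call for a direct dense solve (e.g. `np.linalg.solve`) gives a toy analogue of the paper's second-order implementation; the first-order variant only ever touches the Hessian through matrix-vector products, which is why a Krylov method like MinRes fits there.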
