Poster
Finding Second-Order Stationary Points in Nonconvex-Strongly-Concave Minimax Optimization
Luo Luo · Yujun Li · Cheng Chen
We study the smooth minimax optimization problem $\min_{\bf x}\max_{\bf y} f({\bf x},{\bf y})$, where $f$ is $\ell$-smooth and strongly concave in ${\bf y}$ but possibly nonconvex in ${\bf x}$. Most existing works focus on finding a first-order stationary point of the function $f({\bf x},{\bf y})$ or of its primal function $P({\bf x})\triangleq \max_{\bf y} f({\bf x},{\bf y})$, but few address the second-order stationary point, which is essential for nonconvex problems. In this paper, we propose a novel approach for minimax optimization, called Minimax Cubic Newton (MCN), which finds an ${\mathcal O}\left(\varepsilon,\kappa^{1.5}\sqrt{\rho\varepsilon}\right)$-second-order stationary point of $P({\bf x})$ with ${\mathcal O}\left(\kappa^{1.5}\sqrt{\rho}\,\varepsilon^{-1.5}\right)$ calls to a second-order oracle and $\tilde{\mathcal O}\left(\kappa^{2}\sqrt{\rho}\,\varepsilon^{-1.5}\right)$ calls to a first-order oracle, where $\kappa$ is the condition number and $\rho$ is the Lipschitz constant of the Hessian of $f({\bf x},{\bf y})$. In addition, we propose an inexact variant of MCN for high-dimensional problems that avoids calling the expensive second-order oracle. Instead, our method solves the cubic subproblem inexactly via gradient descent and matrix Chebyshev expansion. This strategy still obtains the desired approximate second-order stationary point with high probability, but requires only $\tilde{\mathcal O}\left(\kappa^{1.5}\ell\varepsilon^{-2}\right)$ Hessian-vector oracle calls and $\tilde{\mathcal O}\left(\kappa^{2}\sqrt{\rho}\,\varepsilon^{-1.5}\right)$ first-order oracle calls. To the best of our knowledge, this is the first work that considers the non-asymptotic convergence behavior of finding second-order stationary points for minimax problems without convex-concave assumptions.
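The abstract describes the overall template: an inner maximization over ${\bf y}$ (first-order oracle), an outer cubic-regularized Newton step on the primal $P({\bf x})$, with the cubic subproblem solved inexactly by gradient descent. A minimal sketch of that template on a toy quadratic instance is below; the objective $f(x,y) = \frac{1}{2}x^\top A x + x^\top B y - \frac{\mu}{2}\|y\|^2$, the matrices `A`, `B`, and all step sizes and iteration counts are illustrative assumptions, not the paper's algorithm or its Chebyshev-expansion machinery.

```python
import numpy as np

# Toy nonconvex-strongly-concave instance (our own assumption, not from
# the paper):  f(x, y) = 0.5 x^T A x + x^T B y - 0.5 mu ||y||^2.
A = np.diag([1.0, -0.2])        # indefinite => f is nonconvex in x
B = np.array([[0.0], [1.0]])
mu, rho = 1.0, 1.0              # concavity modulus; cubic regularizer

def inner_max(x, y, steps=50, lr=0.5):
    """Approximate y*(x) = argmax_y f(x, y) by gradient ascent."""
    for _ in range(steps):
        y = y + lr * (B.T @ x - mu * y)   # grad_y f(x, y)
    return y

def cubic_step(g, H, iters=300, lr=0.2):
    """Inexactly minimize g^T s + 0.5 s^T H s + (rho/6)||s||^3 by
    gradient descent, mirroring the inexact cubic subproblem solve."""
    s = np.zeros_like(g)
    for _ in range(iters):
        s = s - lr * (g + H @ s + 0.5 * rho * np.linalg.norm(s) * s)
    return s

x, y = np.array([1.0, -1.0]), np.zeros(1)
for _ in range(15):
    y = inner_max(x, y)
    g = A @ x + B @ y               # approximates grad P(x)
    H = A + (B @ B.T) / mu          # Hessian of P for this quadratic toy
    x = x + cubic_step(g, H)

print(np.linalg.norm(A @ x + B @ inner_max(x, y)))  # small: near-stationary
```

For this toy problem the primal Hessian $A + BB^\top/\mu$ is positive definite even though $A$ is indefinite, so the outer cubic Newton iteration drives the primal gradient toward zero; the cubic term caps the step length exactly as in cubic-regularization methods.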
Author Information
Luo Luo (Fudan University)
Yujun Li (Huawei Technologies Ltd.)
Cheng Chen (Nanyang Technological University)
More from the Same Authors

2022 Poster: Quasi-Newton Methods for Saddle Point Problems »
Chengchang Liu · Luo Luo 
2022 Poster: Faster Stochastic Algorithms for Minimax Optimization under Polyak-{\L}ojasiewicz Condition »
Lesi Chen · Boyuan Yao · Luo Luo 
2022 Spotlight: Quasi-Newton Methods for Saddle Point Problems »
Chengchang Liu · Luo Luo 
2020 Poster: Efficient Projection-free Algorithms for Saddle Point Problems »
Cheng Chen · Luo Luo · Weinan Zhang · Yong Yu 
2020 Poster: Stochastic Recursive Gradient Descent Ascent for Stochastic Nonconvex-Strongly-Concave Minimax Problems »
Luo Luo · Haishan Ye · Zhichao Huang · Tong Zhang 
2020 Poster: Decentralized Accelerated Proximal Gradient Descent »
Haishan Ye · Ziang Zhou · Luo Luo · Tong Zhang