Oral
A Single-Loop Accelerated Extra-Gradient Difference Algorithm with Improved Complexity Bounds for Constrained Minimax Optimization
Yuanyuan Liu · Fanhua Shang · Weixin An · Junhao Liu · Hongying Liu · Zhouchen Lin
Hall C2 (level 1 gate 9 south of food court)
Abstract:
In this paper, we propose a novel extra-gradient difference acceleration algorithm for solving constrained nonconvex-nonconcave (NC-NC) minimax problems. In particular, we design a new extra-gradient difference step that yields an important quasi-cocoercivity property, which plays a key role in significantly improving the convergence rate in the constrained NC-NC setting without any additional structural assumptions. Momentum acceleration is also introduced into our dual accelerating update step. Moreover, we prove that, to find an $\epsilon$-stationary point of the function $f$, our algorithm attains the complexity $\mathcal{O}(\epsilon^{-2})$ in the constrained NC-NC setting, while the best-known complexity bound is $\widetilde{\mathcal{O}}(\epsilon^{-4})$, where $\widetilde{\mathcal{O}}(\cdot)$ hides logarithmic factors compared to $\mathcal{O}(\cdot)$. As special cases of the constrained NC-NC setting, our algorithm also attains the same complexity $\mathcal{O}(\epsilon^{-2})$ for both the nonconvex-concave (NC-C) and convex-nonconcave (C-NC) cases, while the best-known complexity bounds are $\widetilde{\mathcal{O}}(\epsilon^{-2.5})$ for the NC-C case and $\widetilde{\mathcal{O}}(\epsilon^{-4})$ for the C-NC case. For a fair comparison with existing algorithms, we also analyze the complexity bound for finding an $\epsilon$-stationary point of the primal function for the constrained NC-C problem, showing that our algorithm improves the complexity bound from $\widetilde{\mathcal{O}}(\epsilon^{-3})$ to $\mathcal{O}(\epsilon^{-2})$. To the best of our knowledge, this is the first time that the best-known complexity bounds have been improved from $\mathcal{O}(\epsilon^{-4})$ and $\widetilde{\mathcal{O}}(\epsilon^{-3})$ to $\mathcal{O}(\epsilon^{-2})$ in both the NC-NC and NC-C settings.
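For readers unfamiliar with the extra-gradient template the paper builds on, the following is a minimal sketch of the *classic* projected extra-gradient method (Korpelevich-style) on a toy constrained bilinear game, not the authors' extra-gradient difference algorithm. The objective $f(x, y) = xy$ over the box $[-1, 1]^2$, the step size, and the iteration count are all illustrative assumptions.

```python
# Sketch of the classic projected extra-gradient (EG) baseline, NOT the
# paper's extra-gradient difference algorithm. Toy constrained minimax
# problem: min_x max_y f(x, y) = x*y over the box [-1, 1]^2, whose unique
# saddle point is (0, 0).

def proj(v, lo=-1.0, hi=1.0):
    """Euclidean projection onto the interval [lo, hi] (the constraint set)."""
    return max(lo, min(hi, v))

def grad_x(x, y):
    """Partial derivative of f(x, y) = x*y with respect to x."""
    return y

def grad_y(x, y):
    """Partial derivative of f(x, y) = x*y with respect to y."""
    return x

def extra_gradient(x, y, eta=0.1, iters=2000):
    """Projected extra-gradient: extrapolate, then update with the
    gradient evaluated at the extrapolated point."""
    for _ in range(iters):
        # Extrapolation step: a look-ahead point using the current gradients.
        xh = proj(x - eta * grad_x(x, y))   # descent on x
        yh = proj(y + eta * grad_y(x, y))   # ascent on y
        # Update step: re-evaluate the gradients at the look-ahead point.
        x = proj(x - eta * grad_x(xh, yh))
        y = proj(y + eta * grad_y(xh, yh))
    return x, y

x_star, y_star = extra_gradient(0.9, -0.8)
print(x_star, y_star)  # both coordinates approach the saddle point (0, 0)
```

On this bilinear game, plain simultaneous gradient descent-ascent diverges, while the look-ahead step makes the iterates contract toward the saddle point; the paper's contribution is a modified "extra-gradient difference" step (plus momentum) with a quasi-cocoercivity property that sharpens the rate in the much harder constrained NC-NC setting.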