
Oral

A Single-Loop Accelerated Extra-Gradient Difference Algorithm with Improved Complexity Bounds for Constrained Minimax Optimization

Yuanyuan Liu · Fanhua Shang · Weixin An · Junhao Liu · Hongying Liu · Zhouchen Lin

Hall C2 (level 1 gate 9 south of food court)

Abstract: In this paper, we propose a novel extra-gradient difference acceleration algorithm for solving constrained nonconvex-nonconcave (NC-NC) minimax problems. In particular, we design a new extra-gradient difference step to obtain an important quasi-cocoercivity property, which plays a key role in significantly improving the convergence rate in the constrained NC-NC setting without additional structural assumptions. We also introduce momentum acceleration into our dual accelerating update step. Moreover, we prove that, to find an ϵ-stationary point of the function f, our algorithm attains the complexity O(ϵ^{-2}) in the constrained NC-NC setting, while the best-known complexity bound is Õ(ϵ^{-4}), where Õ(·) hides logarithmic factors relative to O(·). As special cases of the constrained NC-NC setting, our algorithm also attains the same complexity O(ϵ^{-2}) for both the nonconvex-concave (NC-C) and convex-nonconcave (C-NC) cases, while the best-known complexity bounds are Õ(ϵ^{-2.5}) for the NC-C case and Õ(ϵ^{-4}) for the C-NC case. For a fair comparison with existing algorithms, we also analyze the complexity bound for finding an ϵ-stationary point of the primal function ϕ in the constrained NC-C problem, and show that our algorithm improves the complexity bound from Õ(ϵ^{-3}) to O(ϵ^{-2}). To the best of our knowledge, this is the first time the best-known complexity bounds have been improved from O(ϵ^{-4}) and Õ(ϵ^{-3}) to O(ϵ^{-2}) in both the NC-NC and NC-C settings.
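The abstract describes the method only at a high level, so the following is a minimal illustrative sketch, not the authors' algorithm: a generic single-loop projected extra-gradient scheme for a constrained minimax problem min_x max_y f(x, y), with a heavy-ball momentum term on the dual update in the spirit of the "dual accelerating update step" mentioned above. The paper's extra-gradient *difference* step and the quasi-cocoercivity analysis are not reproduced here; the function names, step sizes, and ball constraint below are all hypothetical choices for the sketch.

```python
import numpy as np

def project(z, radius=1.0):
    # Euclidean projection onto an l2 ball, standing in for the
    # problem's constraint sets X and Y (hypothetical choice).
    norm = np.linalg.norm(z)
    return z if norm <= radius else z * (radius / norm)

def extra_gradient_momentum(grad_x, grad_y, x, y,
                            steps=1000, eta=0.1, beta=0.5):
    """Generic projected extra-gradient loop with heavy-ball momentum
    on the dual (y) update. This is a sketch under assumptions, not the
    paper's extra-gradient difference algorithm."""
    y_prev = y.copy()
    for _ in range(steps):
        # Extrapolation step: gradients evaluated at the current point.
        x_half = project(x - eta * grad_x(x, y))
        y_half = project(y + eta * grad_y(x, y))
        # Update step: gradients evaluated at the extrapolated point,
        # plus a momentum term beta * (y - y_prev) on the dual variable.
        x = project(x - eta * grad_x(x_half, y_half))
        y_new = project(y + eta * grad_y(x_half, y_half)
                        + beta * (y - y_prev))
        y_prev, y = y, y_new
    return x, y

# Toy usage on the bilinear saddle problem f(x, y) = x^T A y,
# whose constrained saddle point is at the origin.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
gx = lambda x, y: A @ y      # gradient of f in x
gy = lambda x, y: A.T @ x    # gradient of f in y
x_star, y_star = extra_gradient_momentum(gx, gy, np.ones(2), np.ones(2))
```

On this toy bilinear problem the iterates shrink toward the origin; plain (projected) gradient descent-ascent would cycle, which is why extra-gradient-type extrapolation is the standard starting point for minimax schemes like the one the paper builds on.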
