Abstract:
This paper studies minimax optimization problems $\min_x \max_y f(x,y)$, where $f(x,y)$ is $\mu_x$-strongly convex with respect to $x$, $\mu_y$-strongly concave with respect to $y$, and $(L_x, L_{xy}, L_y)$-smooth. Zhang et al. \cite{zhang2019lower}
provided the following lower bound on the gradient complexity for
any first-order method: $\Omega\left(\sqrt{\frac{L_x}{\mu_x} + \frac{L_{xy}^2}{\mu_x \mu_y} + \frac{L_y}{\mu_y}} \cdot \ln\frac{1}{\epsilon}\right)$.
This paper proposes a new algorithm and proves a gradient complexity upper bound of $\tilde{O}\left(\sqrt{\frac{L_x}{\mu_x} + \frac{L \cdot L_{xy}}{\mu_x \mu_y} + \frac{L_y}{\mu_y}} \cdot \ln^3\frac{1}{\epsilon}\right)$,
where $L = \max\{L_x, L_{xy}, L_y\}$. This improves over the best known upper bound $\tilde{O}\left(\sqrt{\frac{L^2}{\mu_x \mu_y}} \cdot \ln^3\frac{1}{\epsilon}\right)$
by Lin et al. \cite{lin2020near}. Our bound achieves a linear convergence rate and tighter dependency on the condition numbers, especially when $L_{xy} \ll L$
(i.e., the weak interaction regime).
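To make the weak-interaction claim concrete, here is a sketch of the comparison in the limiting case, using condition numbers $\kappa_x = L_x/\mu_x$ and $\kappa_y = L_y/\mu_y$ (notation introduced here for illustration):

```latex
% Weak interaction regime: as the coupling L_{xy} -> 0, we have L = max{L_x, L_y}.
% The new bound decouples into the two single-variable condition numbers:
\sqrt{\frac{L_x}{\mu_x} + \frac{L\,L_{xy}}{\mu_x \mu_y} + \frac{L_y}{\mu_y}}
\;\longrightarrow\; \sqrt{\kappa_x + \kappa_y},
% while the previous bound retains the product of condition numbers:
\qquad
\sqrt{\frac{L^2}{\mu_x \mu_y}} \;\ge\; \sqrt{\frac{L_x L_y}{\mu_x \mu_y}}
= \sqrt{\kappa_x \kappa_y}.
```

Since $\sqrt{\kappa_x + \kappa_y}$ scales like the square root of the larger condition number while $\sqrt{\kappa_x \kappa_y}$ is their geometric mean, the improvement can be as large as a $\sqrt{\min\{\kappa_x, \kappa_y\}}$ factor in this regime.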
Via a simple reduction, our new bound also implies improved bounds for strongly convex-concave problems and convex-concave problems.
When $f$ is quadratic, we can further improve the bound to $\tilde{O}\left(\sqrt{\frac{L_x}{\mu_x} + \frac{L_{xy}^2}{\mu_x \mu_y} + \frac{L_y}{\mu_y}} \cdot \left(\frac{L^2}{\mu_x \mu_y}\right)^{o(1)} \cdot \ln\frac{1}{\epsilon}\right)$, which matches the lower bound up to a sub-polynomial factor.