Poster

SAPD+: An Accelerated Stochastic Method for Nonconvex-Concave Minimax Problems

Xuan Zhang · Necdet Serhat Aybat · Mert Gurbuzbalaban

Hall J (level 1) #823

Keywords: [ saddle point problems ] [ nonconvex optimization ] [ accelerated methods ] [ stochastic gradient ]


Abstract: We propose a new stochastic method, SAPD+, for solving nonconvex-concave minimax problems of the form $\min_x \max_y \mathcal{L}(x,y) = f(x) + \Phi(x,y) - g(y)$, where $f$ and $g$ are closed convex functions and $\Phi(x,y)$ is a smooth function that is weakly convex in $x$ and (strongly) concave in $y$. For both the strongly concave and merely concave settings, SAPD+ achieves the best known oracle complexities of $\mathcal{O}(L\kappa_y\epsilon^{-4})$ and $\mathcal{O}(L^3\epsilon^{-6})$, respectively, without assuming compactness of the problem domain, where $\kappa_y$ is the condition number and $L$ is the Lipschitz constant. We also propose SAPD+ with variance reduction, which enjoys the best known oracle complexity of $\mathcal{O}(L\kappa_y^2\epsilon^{-3})$ in the weakly convex-strongly concave setting. We demonstrate the efficiency of SAPD+ on a distributionally robust learning problem with a nonconvex regularizer, as well as on a multi-class classification problem in deep learning.
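To make the problem structure $\min_x \max_y f(x) + \Phi(x,y) - g(y)$ concrete, below is a minimal sketch of a plain stochastic proximal gradient descent-ascent loop on a toy instance with this composite form. It is not the SAPD+ algorithm itself (SAPD+ adds acceleration and, in one variant, variance reduction); the particular choices of $f$, $\Phi$, $g$, the step sizes `tau` and `eta`, and the noise model are all assumptions made purely for illustration.

```python
import numpy as np

# Toy instance of min_x max_y f(x) + Phi(x, y) - g(y) (illustrative assumptions):
#   f(x)      = lam_x * ||x||_1                 closed convex, prox-friendly
#   Phi(x, y) = y^T A x - (mu_y / 2) ||y||^2    smooth, strongly concave in y
#   g(y)      = (lam_y / 2) ||y||^2             closed convex
# A stochastic first-order oracle is mimicked by adding Gaussian noise
# to the partial gradients of Phi.

rng = np.random.default_rng(0)
n, m = 20, 10
A = rng.standard_normal((m, n)) / np.sqrt(n)
lam_x, lam_y, mu_y = 0.01, 0.1, 0.5
sigma = 0.1  # stochastic-gradient noise level

def prox_l1(v, t):
    """Proximal map of t * lam_x * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam_x, 0.0)

def stoch_grads(x, y):
    """Noisy partial gradients of Phi at (x, y)."""
    gx = A.T @ y + sigma * rng.standard_normal(n)          # grad_x Phi + noise
    gy = A @ x - mu_y * y + sigma * rng.standard_normal(m)  # grad_y Phi + noise
    return gx, gy

x = rng.standard_normal(n)
y = np.zeros(m)
tau, eta = 0.05, 0.05  # primal / dual step sizes (assumed, not tuned)

for k in range(2000):
    gx, gy = stoch_grads(x, y)
    # Dual ascent step, then prox of g(y) = (lam_y/2)||y||^2 in closed form.
    y = (y + eta * gy) / (1.0 + eta * lam_y)
    # Primal descent step, then prox of f(x) = lam_x * ||x||_1.
    x = prox_l1(x - tau * gx, tau)

print("||x|| =", np.linalg.norm(x), " ||y|| =", np.linalg.norm(y))
```

The proximal steps are what let the scheme handle the nonsmooth convex terms $f$ and $g$ directly, which is the role those terms play in the problem class targeted by SAPD+.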
