Poster
SAPD+: An Accelerated Stochastic Method for Nonconvex-Concave Minimax Problems
Xuan Zhang · Necdet Serhat Aybat · Mert Gurbuzbalaban
Hall J (level 1) #823
Keywords: [ saddle point problems ] [ nonconvex optimization ] [ accelerated methods ] [ stochastic gradient ]
Abstract:
We propose a new stochastic method, SAPD+, for solving nonconvex-concave minimax problems of the form $\min_x \max_y \mathcal{L}(x,y) = f(x) + \Phi(x,y) - g(y)$, where $f$ and $g$ are closed convex and $\Phi(x,y)$ is a smooth function that is weakly convex in $x$ and (strongly) concave in $y$. For both the strongly concave and merely concave settings, SAPD+ achieves the best known oracle complexities of $\mathcal{O}(L\kappa_y\epsilon^{-4})$ and $\mathcal{O}(L^3\epsilon^{-6})$, respectively, without assuming compactness of the problem domain, where $\kappa_y$ is the condition number and $L$ is the Lipschitz constant. We also propose SAPD+ with variance reduction, which enjoys the best known oracle complexity of $\mathcal{O}(L\kappa_y^2\epsilon^{-3})$ in the weakly convex-strongly concave setting. We demonstrate the efficiency of SAPD+ on a distributionally robust learning problem with a nonconvex regularizer and on a multi-class classification problem in deep learning.
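To make the problem template concrete, here is a minimal sketch of plain stochastic gradient descent-ascent on a toy convex-strongly concave instance of $\min_x \max_y \Phi(x,y)$. This is not SAPD+ itself (the accelerated primal-dual updates are given in the paper), and the coupling matrix `A`, noise level, and step sizes `tau` and `sigma` are all illustrative assumptions:

```python
# A minimal sketch (not the authors' SAPD+): stochastic gradient
# descent-ascent on Phi(x, y) = y^T A x - (mu_y / 2) ||y||^2, which is
# convex in x and strongly concave in y; the saddle point is (0, 0).
import numpy as np

rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))  # assumed coupling matrix
mu_y = 1.0                       # strong concavity modulus in y
x, y = np.ones(d), np.zeros(d)
tau, sigma = 0.05, 0.05          # primal/dual step sizes (illustrative)

for t in range(2000):
    noise = 0.01 * rng.standard_normal((2, d))  # stochastic oracle noise
    g_x = A.T @ y + noise[0]                    # noisy grad_x Phi(x, y)
    g_y = A @ x - mu_y * y + noise[1]           # noisy grad_y Phi(x, y)
    x -= tau * g_x                              # descent step in x
    y += sigma * g_y                            # ascent step in y

print("||x|| =", np.linalg.norm(x), " ||y|| =", np.linalg.norm(y))
```

Both iterates drift toward the saddle point at the origin, up to a noise floor set by the gradient variance; SAPD+ improves on this baseline via acceleration and, in its variance-reduced variant, via reduced gradient noise.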