Poster
SAPD+: An Accelerated Stochastic Method for Nonconvex-Concave Minimax Problems
Xuan Zhang · Necdet Serhat Aybat · Mert Gurbuzbalaban
Hall J (level 1) #823
Keywords: [ saddle point problems ] [ nonconvex optimization ] [ accelerated methods ] [ stochastic gradient ]
Abstract:
We propose a new stochastic method SAPD+ for solving nonconvex-concave minimax problems of the form min_x max_y L(x,y) = f(x) + Φ(x,y) − g(y), where f, g are closed convex and Φ(x,y) is a smooth function that is weakly convex in x and (strongly) concave in y. For both the strongly concave and the merely concave settings, SAPD+ achieves the best known oracle complexities of O(L κ_y ε⁻⁴) and O(L³ ε⁻⁶), respectively, without assuming compactness of the problem domain, where κ_y is the condition number and L is the Lipschitz constant. We also propose SAPD+ with variance reduction, which enjoys the best known oracle complexity of O(L κ_y² ε⁻³) in the weakly convex-strongly concave setting. We demonstrate the efficiency of SAPD+ on a distributionally robust learning problem with a nonconvex regularizer and on a multi-class classification problem in deep learning.
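To make the problem class concrete, the sketch below runs plain stochastic gradient descent-ascent (not SAPD+ itself, whose accelerated primal-dual updates are given in the paper) on a toy instance of min_x max_y L(x,y) = f(x) + Φ(x,y) − g(y). Here f(x) = 0.1x², Φ(x,y) = xy, and g(y) = 0.5y², so L is strongly concave in y with saddle point (0, 0); the Gaussian noise stands in for the stochastic gradient oracle. All function choices and step sizes are illustrative assumptions, not from the paper.

```python
import random

random.seed(0)

def stoch_grad_x(x, y):
    # Noisy estimate of ∂L/∂x = f'(x) + ∂Φ/∂x = 0.2x + y (toy oracle)
    return 0.2 * x + y + random.gauss(0.0, 0.01)

def stoch_grad_y(x, y):
    # Noisy estimate of ∂L/∂y = ∂Φ/∂y − g'(y) = x − y (toy oracle)
    return x - y + random.gauss(0.0, 0.01)

def sgda(x0, y0, step=0.05, iters=2000):
    """Stochastic gradient descent on x, ascent on y."""
    x, y = x0, y0
    for _ in range(iters):
        gx = stoch_grad_x(x, y)
        gy = stoch_grad_y(x, y)
        x, y = x - step * gx, y + step * gy
    return x, y

x, y = sgda(2.0, -1.0)
print(x, y)  # both near the saddle point (0, 0), up to noise
```

With a constant step size the iterates contract toward the saddle point and hover within a noise-dependent ball around it; SAPD+'s momentum/acceleration and variance-reduction variants are what yield the improved oracle complexities stated above.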