

Poster in Workshop: OPT 2022: Optimization for Machine Learning

Nesterov Meets Optimism: Rate-Optimal Optimistic-Gradient-Based Method for Stochastic Bilinearly-Coupled Minimax Optimization

Chris Junchi Li · Angela Yuan · Gauthier Gidel · Michael Jordan


Abstract:

We propose a novel first-order optimization method for bilinearly-coupled strongly-convex-concave minimax optimization, which we call Accelerated Optimistic Gradient (AG-OG). The main idea of our algorithm is to leverage the structure of the minimax problem by applying Nesterov's acceleration to the individual parts of the objective and optimism to the coupling term. We first motivate our method by showing that the dynamics of its continuous-time version correspond to a linear combination of the ODE dynamics of Optimistic Gradient and those of Nesterov's acceleration. This continuous-time approach lets us showcase the main properties of our method, which in turn guide our analysis in the discrete case. Furthermore, by properly restarting AG-OG, we show that we achieve convergence rates that are optimal (up to a constant) with respect to the conditioning of the coupling and the individual parts, in both the stochastic and deterministic settings.
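As a rough illustration of the problem class and of the optimism mechanism on the coupling term, the sketch below runs plain optimistic gradient descent-ascent on a bilinearly-coupled strongly-convex-concave quadratic. It deliberately omits the Nesterov acceleration and restarting that AG-OG adds, and the instance (B, mu_x, mu_y) and step size eta are arbitrary illustrative choices, not parameters from the paper.

```python
import numpy as np

# Illustrative sketch (not the authors' AG-OG update): optimistic gradient
# descent-ascent on the bilinearly-coupled strongly-convex-concave quadratic
#   F(x, y) = (mu_x/2)||x||^2 + y^T B x - (mu_y/2)||y||^2,
# whose unique saddle point is (0, 0).
rng = np.random.default_rng(0)
d, mu_x, mu_y, eta = 5, 1.0, 1.0, 0.05  # arbitrary instance and step size
B = rng.standard_normal((d, d))

def grads(x, y):
    """Return (grad_x F, grad_y F) for the coupled quadratic."""
    return mu_x * x + B.T @ y, B @ x - mu_y * y

x, y = rng.standard_normal(d), rng.standard_normal(d)
gx_prev, gy_prev = grads(x, y)
for _ in range(2000):
    gx, gy = grads(x, y)
    # Optimism: extrapolate with the previous gradient, 2*g_t - g_{t-1}.
    x = x - eta * (2 * gx - gx_prev)   # descent step in x
    y = y + eta * (2 * gy - gy_prev)   # ascent step in y
    gx_prev, gy_prev = gx, gy

print(np.linalg.norm(x), np.linalg.norm(y))  # both shrink toward the saddle (0, 0)
```

The extrapolated gradient 2*g_t - g_{t-1} is what "optimism" refers to: it anticipates the next gradient using the previous one, which damps the rotational dynamics induced by the bilinear coupling; AG-OG additionally accelerates the strongly convex (resp. concave) individual parts in Nesterov's fashion.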
