

Poster in Workshop: OPT 2022: Optimization for Machine Learning

ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration

Siqi Zhang · Nicolas Loizou


Abstract: Recently, Mishchenko et al. proposed and analyzed ProxSkip, a provably efficient method for minimizing the sum of a smooth function ($f$) and an expensive nonsmooth proximable function ($R$), i.e., $\min_{x \in \mathbb{R}^d} f(x) + R(x)$. The main advantage of ProxSkip is that, in the federated learning (FL) setting, it provably achieves an effective acceleration of communication complexity. This work extends this approach to the more general regularized variational inequality problems (VIPs). In particular, we propose the ProxSkip-VIP algorithm, which generalizes the original ProxSkip framework to VIPs, and we provide convergence guarantees for a class of structured non-monotone problems. In the federated learning setting, we explain how our approach achieves acceleration in terms of communication complexity over existing state-of-the-art FL algorithms.
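
To make the skipping idea concrete, here is a minimal, hedged sketch of a ProxSkip-style iteration adapted to a regularized VIP, where the operator `F` plays the role that the gradient of $f$ plays in the original ProxSkip. The function names `proxskip_vip_sketch` and `prox_R`, the step size `gamma`, and the probability `p` are illustrative assumptions, not the authors' exact ProxSkip-VIP method or its stated parameter choices; the control-variate update follows the ProxSkip template of Mishchenko et al.

```python
import numpy as np

def proxskip_vip_sketch(F, prox_R, x0, gamma, p, num_iters, rng=None):
    """Hedged sketch of a ProxSkip-style iteration for a regularized VIP.

    F       : VIP operator (replaces the gradient of f in ProxSkip)
    prox_R  : callable (v, step) -> prox_{step * R}(v)
    gamma   : step size (assumed; not the paper's prescribed value)
    p       : probability of evaluating the expensive prox
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    h = np.zeros_like(x)  # control variate compensating for skipped prox steps
    for _ in range(num_iters):
        x_hat = x - gamma * (F(x) - h)
        if rng.random() < p:
            # Occasional expensive prox step (a communication round in FL)
            x_new = prox_R(x_hat - (gamma / p) * h, gamma / p)
            h = h + (p / gamma) * (x_new - x_hat)
            x = x_new
        else:
            # Skip the prox and keep the cheap local update
            x = x_hat
    return x
```

In the FL interpretation, the prox encodes the consensus/communication step, so taking it only with probability $p$ is what yields the claimed savings in communication rounds.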
