

Spotlight Poster

Stabilized Proximal Point Methods for Federated Optimization

Xiaowen Jiang · Anton Rodomanov · Sebastian Stich

West Ballroom A-D #6009
Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

In developing efficient optimization algorithms, it is crucial to account for communication constraints, a significant challenge in modern federated learning settings. The best-known communication complexity among non-accelerated algorithms is achieved by DANE, a distributed proximal-point algorithm that solves a local subproblem on each client in every iteration and can exploit Hessian similarity. However, to achieve this communication efficiency, the local subproblems must be solved increasingly accurately as the communication rounds progress. Inspired by the extra-gradient method, in this work we (i) propose a novel distributed algorithm, S-DANE. This method adopts a more stabilized reference point in the proximal step and matches the deterministic communication complexity of DANE. Moreover, the required accuracy of the local subproblems remains fixed throughout the optimization process, leading to improved local computation efficiency. Furthermore, it supports partial client participation and arbitrary stochastic local solvers, making it more attractive in practice. We further (ii) accelerate S-DANE and show that the resulting algorithm achieves the best-known communication complexity among all existing methods for convex distributed optimization, while retaining the efficient local computation property.
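To make the distributed proximal-point structure concrete, below is a minimal sketch of one round of a generic DANE-style method on least-squares clients. This is not the paper's S-DANE (it shows neither the stabilized reference point nor the accelerated variant); the subproblem form, function names, and hyperparameters are illustrative assumptions, and the inexact gradient-step solver stands in for an arbitrary local solver.

```python
# Illustrative sketch of a DANE-style round (assumed generic form, not S-DANE).
import numpy as np

def local_gradient(A, b, x):
    """Gradient of the local loss f_i(x) = (1/2n_i) * ||A x - b||^2."""
    return A.T @ (A @ x - b) / len(b)

def solve_local_subproblem(A, b, x_ref, global_grad, mu, n_steps=50, lr=0.1):
    """Approximately minimize
        f_i(x) - <grad f_i(x_ref) - global_grad, x> + (mu/2) ||x - x_ref||^2
    with a fixed number of gradient steps (any inexact local solver would do)."""
    correction = local_gradient(A, b, x_ref) - global_grad
    x = x_ref.copy()
    for _ in range(n_steps):
        g = local_gradient(A, b, x) - correction + mu * (x - x_ref)
        x -= lr * g
    return x

def dane_style_round(clients, x_ref, mu):
    """One communication round: aggregate gradients at the reference point,
    let each client solve its regularized local subproblem, then average."""
    grads = [local_gradient(A, b, x_ref) for A, b in clients]
    global_grad = np.mean(grads, axis=0)
    local_solutions = [solve_local_subproblem(A, b, x_ref, global_grad, mu)
                       for A, b in clients]
    return np.mean(local_solutions, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 5
    x_true = rng.normal(size=d)
    clients = []
    for _ in range(4):
        A = rng.normal(size=(20, d))
        clients.append((A, A @ x_true + 0.01 * rng.normal(size=20)))
    x = np.zeros(d)
    for _ in range(20):
        x = dane_style_round(clients, x, mu=1.0)
    print("distance to x*:", np.linalg.norm(x - x_true))
```

In this sketch the reference point for both the gradient correction and the proximal term is simply the latest averaged iterate; the paper's contribution is, per the abstract, to use a more stabilized reference point so that the subproblem accuracy requirement stays fixed across rounds.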
