

Poster

Towards Optimal Communication Complexity in Distributed Non-Convex Optimization

Kumar Kshitij Patel · Lingxiao Wang · Blake Woodworth · Brian Bullins · Nati Srebro

Hall J (level 1) #936

Keywords: [ Distributed Optimization ] [ Stochastic Optimization ] [ Non-Convex Optimization ] [ Federated Learning ] [ Intermittent Communication Setting ] [ Lower Bounds ]


Abstract: We study the problem of distributed stochastic non-convex optimization with intermittent communication. We consider the full participation setting, where M machines work in parallel over R communication rounds, and the partial participation setting, where M machines are sampled independently in every round from some meta-distribution over machines. We propose and analyze a new algorithm that improves over existing methods by requiring fewer and lighter variance reduction operations. We also present lower bounds, showing our algorithm is either optimal or nearly optimal in most settings. Numerical experiments demonstrate the superior performance of our algorithm.
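As a rough illustration of the intermittent communication setting the abstract refers to, the sketch below simulates M machines each running K local stochastic-gradient steps between each of R synchronization rounds (a plain Local SGD baseline in the full participation setting). The paper's proposed algorithm additionally uses variance reduction, which is not shown here; the toy objective, step size, and all function names are illustrative assumptions rather than the authors' method.

```python
# Minimal sketch of the intermittent-communication setting (Local SGD baseline):
# M machines each take K local stochastic-gradient steps per round, then
# synchronize by averaging their iterates, for R communication rounds.
# The paper's algorithm adds variance-reduction operations NOT shown here;
# the objective and hyperparameters below are illustrative assumptions.
import numpy as np

def stoch_grad(x, rng, noise_std=0.1):
    """Stochastic gradient of a toy non-convex objective f(x) = sum(x^2 - cos(x))."""
    true_grad = 2.0 * x + np.sin(x)
    return true_grad + noise_std * rng.standard_normal(x.shape)

def local_sgd(M=8, R=20, K=10, lr=0.05, dim=5, seed=0):
    rng = np.random.default_rng(seed)
    x_global = rng.standard_normal(dim)              # shared initial iterate
    for _ in range(R):                               # R communication rounds
        local_iterates = []
        for _ in range(M):                           # M machines (simulated serially)
            x = x_global.copy()
            for _ in range(K):                       # K local steps between communications
                x -= lr * stoch_grad(x, rng)
            local_iterates.append(x)
        x_global = np.mean(local_iterates, axis=0)   # synchronize by averaging
    return x_global

if __name__ == "__main__":
    print("final iterate:", local_sgd())
```

In the partial participation variant described in the abstract, the inner loop over machines would instead draw M clients from a meta-distribution at the start of each round rather than reusing a fixed set.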
