

Poster in Workshop: New Frontiers in Federated Learning: Privacy, Fairness, Robustness, Personalization and Data Ownership

Sharp Bounds for FedAvg (Local SGD)

Margalit Glasgow · Honglin Yuan · Tengyu Ma


Abstract:

Federated Averaging (FedAvg), also known as Local SGD, is one of the most popular algorithms and the de facto standard in Federated Learning (FL). This distributed optimization algorithm involves running stochastic gradient descent (SGD) simultaneously on many machines and infrequently averaging the iterates across machines. Despite its simplicity and popularity, the convergence rate of FedAvg has so far been undetermined. In this work, we provide a lower bound for FedAvg that matches the existing upper bound in the convex homogeneous setting. As an extension, we also establish a lower bound in the heterogeneous (non-iid) setting that matches the existing upper bound up to the definition of the heterogeneity measure. Our analysis is based on a sharp characterization of the drift of the expectation of an SGD iterate.
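
The abstract describes FedAvg (Local SGD) as running SGD in parallel on many machines with infrequent averaging of the iterates. The sketch below is a minimal, illustrative rendering of that procedure, not the authors' implementation; the function name `local_sgd`, its parameters, and the toy quadratic objective are assumptions chosen only to make the averaging structure concrete.

```python
import numpy as np

def local_sgd(grad_fn, x0, num_machines=4, num_rounds=10,
              local_steps=5, lr=0.1, rng=None):
    """Minimal FedAvg / Local SGD sketch: each machine runs `local_steps`
    of SGD from the shared iterate, then the local iterates are averaged."""
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)                # shared (averaged) iterate
    for _ in range(num_rounds):
        local_iterates = []
        for m in range(num_machines):
            xm = x.copy()                        # start from the current average
            for _ in range(local_steps):
                g = grad_fn(xm, m, rng)          # stochastic gradient on machine m
                xm -= lr * g
            local_iterates.append(xm)
        x = np.mean(local_iterates, axis=0)      # infrequent averaging step
    return x

# Toy usage (hypothetical objective): 0.5*||x||^2 with additive gradient noise.
if __name__ == "__main__":
    grad = lambda x, m, rng: x + 0.1 * rng.standard_normal(x.shape)
    print(local_sgd(grad, np.ones(3)))
```

In the homogeneous setting all machines would draw from the same data distribution, whereas the heterogeneous (non-iid) setting discussed in the abstract corresponds to each machine's `grad_fn` having a different expectation.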
