

Poster in Workshop: Federated Learning: Recent Advances and New Challenges

Towards Provably Personalized Federated Learning via Threshold-Clustering of Similar Clients

Mariel A Werner · Lie He · Sai Praneeth Karimireddy · Michael Jordan · Martin Jaggi


Abstract: Clustering clients with similar objectives together and learning a model per cluster is an intuitive and interpretable approach to personalization in federated learning (PFL). However, doing so with provable and optimal guarantees has remained an open challenge. In this work, we formalize personalized federated learning as a stochastic optimization problem where the stochastic gradients on a client may correspond to one of $K$ distributions. In such a setting, we show that combining i) a simple thresholding-based clustering algorithm and ii) local client momentum yields optimal convergence guarantees. In fact, our rates asymptotically match those obtained if we knew the true underlying clustering of the clients. Further, we extend our algorithm to the decentralized setting, where each node performs clustering using itself as the center.
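To make the clustering idea in the abstract concrete, here is a minimal sketch of one plausible reading of threshold-based clustering of client gradients. The function names, the projection-onto-a-ball thresholding rule, and the alternating update loop are assumptions for illustration only, not the authors' exact algorithm or guarantees.

```python
import numpy as np

def threshold_cluster_mean(points, center, tau):
    """Hypothetical thresholded averaging step: points within distance tau
    of the current center are used as-is; points farther away are pulled
    back to the boundary of the tau-ball before averaging.
    Illustrative reading only, not the paper's exact update rule."""
    clipped = []
    for p in points:
        d = np.linalg.norm(p - center)
        if d <= tau:
            clipped.append(p)
        else:
            # Project the outlying point onto the ball of radius tau around the center.
            clipped.append(center + tau * (p - center) / d)
    return np.mean(clipped, axis=0)

def cluster_clients(grads, centers, tau, steps=10):
    """Re-estimate each cluster center from the client gradients near it,
    repeating the thresholded-mean step a few times."""
    centers = [np.asarray(c, dtype=float).copy() for c in centers]
    for _ in range(steps):
        centers = [threshold_cluster_mean(grads, c, tau) for c in centers]
    return centers

# Toy usage: 10 clients whose stochastic gradients come from K = 2 distributions.
rng = np.random.default_rng(0)
grads = [rng.normal(loc=+1.0, scale=0.1, size=2) for _ in range(5)] + \
        [rng.normal(loc=-1.0, scale=0.1, size=2) for _ in range(5)]
centers = cluster_clients(grads, centers=[grads[0], grads[5]], tau=0.5)
print(centers)  # two centers, one near (+1, +1) and one near (-1, -1)
```

In the decentralized variant described in the abstract, each node would run an analogous thresholded-averaging step with its own gradient as the center, rather than maintaining shared cluster centers.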
