

Poster in Workshop: New Frontiers in Federated Learning: Privacy, Fairness, Robustness, Personalization and Data Ownership

Bayesian SignSGD Optimizer for Federated Learning

Paulo Ferreira · Pablo Silva · Vinicius Gottin


Abstract:

Federated Learning is a distributed Machine Learning framework aimed at training a global model by sharing edge nodes' locally trained models instead of their datasets. This presents three major challenges: communication between edge nodes and the central node; heterogeneity of edge nodes (e.g., availability, compute, datasets); and security. In this paper, we focus on the communication challenge, which is twofold: decreasing the number of communication rounds, and compressing the information sent back and forth between edge nodes and the central node. In particular, we are interested in cases where strict constraints on the allowed network traffic of gradients may apply, e.g., frequent training of predictive models for globally distributed devices. The recent success of 1-bit compressors (e.g., majority voting SignSGD) is promising; however, such high-compression methods are known to suffer from slow or problematic convergence. We propose a Bayesian framework, named BB-SignSGD, encompassing 1-bit compressors and enabling a principled and flexible choice of how much information to carry over from previous communication rounds during central aggregation. We prove that majority voting SignSGD is a special case of our framework under particular choices. We present results from extensive experiments on five different datasets, showing that, compared to majority voting SignSGD, other choices within BB-SignSGD support higher learning rates and achieve faster convergence, competitive even with uncompressed communication.
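
For context, the abstract builds on majority voting SignSGD, where each client sends only the sign of its local gradient and the server applies a coordinate-wise majority vote. The sketch below illustrates that baseline aggregation only; it is not the authors' BB-SignSGD method, and the function names (compress_sign, aggregate_majority_vote, server_step) are hypothetical.

```python
# Minimal sketch of majority-vote SignSGD aggregation (the baseline the
# abstract refers to), assuming simple NumPy arrays for gradients.
import numpy as np

def compress_sign(grad: np.ndarray) -> np.ndarray:
    """1-bit compression: each client transmits only the sign of its local gradient."""
    return np.sign(grad)

def aggregate_majority_vote(client_signs: list) -> np.ndarray:
    """Central node takes a coordinate-wise majority vote over the received signs."""
    return np.sign(np.sum(client_signs, axis=0))

def server_step(weights: np.ndarray, client_grads: list, lr: float) -> np.ndarray:
    """One communication round: clients send signs, server votes and updates."""
    signs = [compress_sign(g) for g in client_grads]
    vote = aggregate_majority_vote(signs)
    # The broadcast update is itself 1-bit per coordinate.
    return weights - lr * vote

# Example round with three simulated clients.
rng = np.random.default_rng(0)
w = rng.normal(size=5)
grads = [rng.normal(size=5) for _ in range(3)]
w = server_step(w, grads, lr=0.01)
```

Per the abstract, BB-SignSGD generalizes this aggregation step by carrying information from previous rounds in a Bayesian manner, with majority voting recovered as a special case.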
