Diurnal or Nocturnal? Federated Learning from Periodically Shifting Distributions
Chen Zhu · Zheng Xu · Mingqing Chen · Jakub Konečný · Andrew S Hard · Tom Goldstein
Event URL: https://openreview.net/forum?id=WRmTnEOk0E

Federated learning has been deployed in practice to train machine learning models from decentralized client data on mobile devices. The clients available for training are observed to have periodically shifting distributions that change with the time of day, which can cause instability in training and degrade model performance. In this paper, instead of modeling the distribution shift with a block-cyclic pattern as in prior work, we model it with a mixture of distributions that gradually changes between daytime and nighttime modes, and find this intuitive model to better match observations from practical federated learning systems. We propose a Federated Expectation-Maximization algorithm enhanced by Temporal priors of the shifting distribution (FedTEM), which jointly learns a mixture model to infer the mode of each client while training a network with multiple lightweight branches specializing in different modes. Experiments on image classification with the EMNIST and CIFAR datasets and on next-word prediction with the Stack Overflow dataset show that the proposed algorithm effectively mitigates the impact of the distribution shift and significantly improves final model performance.
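The paper's full FedTEM algorithm is described at the event URL above; as a loose, self-contained illustration of the core idea (an EM mode assignment whose prior depends on the time of day), here is a toy sketch. The cosine prior, the Gaussian likelihood on a scalar client statistic, and all function names are illustrative assumptions, not the paper's actual method.

```python
import math

def temporal_prior(t_hours, period=24.0):
    """Hypothetical smooth time-of-day prior over two modes:
    the 'daytime' mode peaks at noon and fades toward midnight,
    so clients gradually mix between the two modes."""
    p_day = 0.5 * (1.0 + math.cos(2.0 * math.pi * (t_hours - 12.0) / period))
    return [p_day, 1.0 - p_day]

def e_step(x, t_hours, means, std=1.0):
    """Responsibility of each mode for one client's statistic x,
    combining the temporal prior with a Gaussian likelihood."""
    prior = temporal_prior(t_hours)
    lik = [math.exp(-0.5 * ((x - m) / std) ** 2) for m in means]
    w = [p * l for p, l in zip(prior, lik)]
    z = sum(w)
    return [wi / z for wi in w]

def m_step(xs, resps):
    """Re-estimate the per-mode means from responsibility-weighted
    client statistics (the server-side aggregation step)."""
    means = []
    for k in range(2):
        num = sum(r[k] * x for x, r in zip(xs, resps))
        den = sum(r[k] for r in resps)
        means.append(num / den)
    return means
```

For example, alternating E- and M-steps over clients observed at noon and midnight separates the two mode means; in the actual system, the inferred mode would instead select which lightweight branch of the shared network each client trains.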

Author Information

Chen Zhu (University of Maryland, College Park)
Zheng Xu (Google AI)
Mingqing Chen (Google)
Jakub Konečný
Andrew S Hard (Google)

I'm a Senior Software Engineer at Google, where I currently work on applications of federated learning. I hold a PhD in high-energy physics from the University of Wisconsin, and spent 5 years searching for the Higgs boson at CERN.

Tom Goldstein (University of Maryland, College Park)
