Refined Convergence and Topology Learning for Decentralized Optimization with Heterogeneous Data
Batiste Le Bars · Aurélien Bellet · Marc Tommasi · Erick Lavoie · Anne-Marie Kermarrec
Event URL: https://openreview.net/forum?id=WBm2z8jrgtP

One of the key challenges in decentralized and federated learning is to design algorithms that efficiently deal with highly heterogeneous data distributions across agents. In this paper, we revisit the analysis of the Decentralized Stochastic Gradient Descent (D-SGD) algorithm under data heterogeneity. We first exhibit the key role played by a new quantity, called neighborhood heterogeneity, in the convergence rate of D-SGD. Neighborhood heterogeneity provides a natural criterion to learn data-dependent and sparse topologies that reduce the detrimental effect of data heterogeneity on the convergence of D-SGD. For the important case of classification with label skew, we formulate the problem of learning a topology as a tractable optimization problem that we solve with a Frank-Wolfe algorithm. Across a set of experiments, the learned sparse topology is shown to balance the convergence speed and the per-iteration communication costs of D-SGD.
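For readers unfamiliar with D-SGD, the update analyzed in the paper can be sketched as follows (a minimal illustration, not the authors' code: the function name, the ring mixing matrix, and the random gradients below are all placeholders). Each agent takes a local stochastic gradient step and then gossip-averages its parameters with its neighbors according to a mixing matrix W, whose sparsity pattern is exactly the communication topology that the paper proposes to learn.

```python
import numpy as np

def dsgd_step(models, W, grads, lr):
    """One D-SGD iteration (sketch): local SGD step, then gossip averaging.

    models: (n_agents, dim) array of local model parameters
    W:      (n_agents, n_agents) doubly stochastic mixing matrix whose
            nonzero entries define the communication topology
    grads:  (n_agents, dim) stochastic gradients, one per agent
    lr:     learning rate
    """
    local = models - lr * grads  # each agent takes a local SGD step
    return W @ local             # then averages with its neighbors via W

# Illustrative use: 4 agents on a ring topology, 3-dimensional models.
n, d = 4, 3
W = np.array([[0.5, 0.25, 0.0, 0.25],
              [0.25, 0.5, 0.25, 0.0],
              [0.0, 0.25, 0.5, 0.25],
              [0.25, 0.0, 0.25, 0.5]])
models = np.random.randn(n, d)
grads = np.random.randn(n, d)  # stand-ins for per-agent stochastic gradients
models = dsgd_step(models, W, grads, lr=0.1)
```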

Author Information

Batiste Le Bars (École Normale Supérieure)
Aurélien Bellet (INRIA)
Marc Tommasi (INRIA)
Erick Lavoie
Anne-Marie Kermarrec (School of Computer and Communication Sciences, EPFL)
