

Poster in Workshop: Decentralization and Trustworthy Machine Learning in Web3: Methodologies, Platforms, and Applications

Communication-efficient Decentralized Deep Learning

Fateme Fotouhi · Aditya Balu · Zhanhong Jiang · Yasaman Esfandiari · Salman Jahani · Soumik Sarkar


Abstract:

Decentralized deep learning algorithms rely on peer-to-peer communication of model parameters and/or gradients over a communication graph connecting learning agents, each with access to its own private data set. The majority of studies in this area focus on achieving high accuracy, many at the expense of increased communication overhead among the agents. However, large peer-to-peer communication overhead often becomes a practical challenge, especially in harsh environments such as underwater sensor networks. In this paper, we aim to reduce communication overhead while achieving performance similar to state-of-the-art algorithms. To this end, we use the concept of a Minimum Connected Dominating Set (MCDS) from graph theory, which has been applied in ad hoc wireless networks to address communication overhead. Specifically, we propose a new decentralized deep learning algorithm called minimum connected Dominating Set Model Aggregation (DSMA). We investigate the efficacy of our method on different communication graph topologies, with small to large numbers of agents and varied neural network architectures. Empirical results on benchmark data sets show a significant (up to 100X) reduction in communication time while preserving, and in some cases improving, accuracy compared to state-of-the-art methods. We also present an analysis showing the convergence of our proposed algorithm.
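The abstract does not detail how the MCDS is constructed or how aggregation over it proceeds, so the following is only an illustrative sketch of the underlying graph-theoretic idea, not the DSMA algorithm itself. It uses a simple greedy heuristic (via networkx, an assumed dependency) to find a small connected dominating set of the communication graph: a connected "backbone" of agents such that every other agent is adjacent to the backbone, so only backbone agents need to relay model parameters.

```python
import networkx as nx


def greedy_connected_dominating_set(G):
    """Illustrative greedy heuristic (not the paper's construction) for a small
    connected dominating set of a connected graph G.

    Starts from the highest-degree node and repeatedly adds the frontier node
    (a neighbor of the current set) that dominates the most uncovered nodes.
    The set stays connected because every added node touches the current set.
    """
    start = max(G.nodes, key=G.degree)
    cds = {start}
    covered = {start} | set(G.neighbors(start))
    while covered != set(G.nodes):
        # Frontier: nodes adjacent to the current set but not yet in it.
        frontier = {v for u in cds for v in G.neighbors(u)} - cds
        # Pick the frontier node whose neighborhood covers the most new nodes.
        best = max(frontier, key=lambda v: len(set(G.neighbors(v)) - covered))
        cds.add(best)
        covered |= {best} | set(G.neighbors(best))
    return cds


# Hypothetical example: a 4x4 grid of 16 agents; only the backbone relays models.
G = nx.grid_2d_graph(4, 4)
backbone = greedy_connected_dominating_set(G)
print(f"{len(backbone)} of {G.number_of_nodes()} agents form the aggregation backbone")
```

In a decentralized learning setting, restricting full model exchange to such a backbone (with non-backbone agents only communicating with an adjacent dominator) is one way the peer-to-peer communication volume could shrink relative to all-neighbor gossip; the specific aggregation rule and convergence guarantees are given in the paper.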
