
Accelerated Federated Optimization with Quantization
Yeojoon Youn · Bhuvesh Kumar · Jacob Abernethy

Fri Dec 02 11:50 AM -- 11:57 AM (PST)
Event URL: https://openreview.net/forum?id=qbYo1fO6Vrb
Federated optimization is a new form of distributed training on very large datasets that leverages many devices, each containing local data. While decentralized computation can lead to significant speed-ups due to parallelization, some centralization is still required: devices must aggregate their parameter updates through synchronization across the network. The potential for a communication bottleneck is therefore significant. The two main methods to tackle this issue are (a) smarter optimization that decreases the frequency of communication rounds and (b) \emph{compression} techniques such as quantization and sparsification that reduce the number of bits machines need to transmit. In this paper, we provide a novel algorithm, \textbf{Fed}erated optimization with \textbf{A}cceleration and \textbf{Q}uantization (FedAQ), with improved theoretical guarantees, obtained by combining an accelerated method of federated averaging, which reduces the number of training and synchronization steps, with an efficient quantization scheme that significantly reduces communication complexity. We show that in a homogeneous strongly convex setting, FedAQ achieves a linear speedup in the number of workers $M$ with only $\tilde{\mathcal{O}}(M^{\frac{1}{3}})$ communication rounds, significantly fewer than required by other quantization-based federated optimization algorithms. Moreover, we empirically verify that our algorithm performs better than current methods.
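To make the quantization-plus-averaging idea concrete, below is a minimal sketch of one quantized federated-averaging loop on a toy homogeneous quadratic objective. This is an illustration of the general scheme the abstract describes, not the FedAQ algorithm itself: the unbiased stochastic quantizer, the quadratic loss, and all constants (number of workers `M`, `levels`, learning rate) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(v, levels=16):
    """Unbiased stochastic uniform quantization: each coordinate is randomly
    rounded to one of `levels` points spanning [-s, s], where s = max |v_i|.
    Only the scale s and the per-coordinate level index need transmitting."""
    s = np.max(np.abs(v))
    if s == 0.0:
        return v.copy()
    x = (v / s + 1.0) / 2.0 * (levels - 1)      # map coordinates to [0, levels-1]
    lo = np.floor(x)
    q = lo + (rng.random(v.shape) < (x - lo))   # randomized rounding (unbiased)
    return (q / (levels - 1) * 2.0 - 1.0) * s   # map back to original range

# Homogeneous setting (toy): every worker holds the same quadratic objective
# f(w) = 0.5 * ||w - c||^2, minimized at c.
M, d = 8, 10
c = rng.normal(size=d)

def local_update(w, steps=5, lr=0.2):
    """A few local SGD steps with noisy gradients before communicating."""
    for _ in range(steps):
        w = w - lr * (w - c + 0.1 * rng.normal(size=d))
    return w

w = np.zeros(d)
for _ in range(30):                             # communication rounds
    # Each worker sends only a quantized version of its model update.
    deltas = [quantize(local_update(w.copy()) - w) for _ in range(M)]
    w = w + np.mean(deltas, axis=0)             # server averages quantized updates

print(float(np.linalg.norm(w - c)))             # distance to the optimum
```

Because the quantizer is unbiased, averaging the `M` quantized updates cancels much of the quantization noise, so the iterates still approach the optimum while each round transmits far fewer bits than full-precision vectors would.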

Author Information

Yeojoon Youn (Georgia Institute of Technology)
Bhuvesh Kumar (Georgia Institute of Technology)
Jacob Abernethy (Georgia Institute of Technology)
