This paper develops a novel aggregated gradient approach for distributed machine learning that adaptively compresses gradient communication. The key idea is to first quantize the computed gradients, and then skip less informative quantized gradient communications by reusing outdated gradients. Quantizing and skipping result in 'lazy' worker-server communications, which justifies the term Lazily Aggregated Quantized gradient, henceforth abbreviated as LAQ. LAQ provably attains the same linear convergence rate as gradient descent in the strongly convex case, while achieving major savings in communication overhead, both in transmitted bits and in communication rounds. Experiments with real data corroborate a significant communication reduction compared to existing gradient- and stochastic gradient-based algorithms.
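To illustrate the quantize-then-skip idea described in the abstract, the sketch below shows one worker's communication decision per round. This is a minimal, illustrative Python sketch, not the paper's implementation: the names (`quantize`, `LAQWorker`, `threshold`) and the simplified skip rule (comparing the change in the quantized gradient against how much the iterate moved) are assumptions standing in for LAQ's exact criterion, which weights several recent iterate differences.

```python
# Minimal sketch of the lazily aggregated quantized (LAQ) idea on one worker.
# The skip rule here is a simplified stand-in for the paper's weighted criterion.
import numpy as np

def quantize(g, ref, bits=4):
    """Uniformly quantize gradient g around a reference (the last sent gradient)."""
    radius = np.max(np.abs(g - ref)) + 1e-12      # quantization range around ref
    levels = 2 ** bits - 1
    step = 2 * radius / levels
    return ref + np.round((g - ref + radius) / step) * step - radius

class LAQWorker:
    def __init__(self, grad_fn, bits=4, threshold=0.1):
        self.grad_fn = grad_fn        # local gradient oracle
        self.bits = bits
        self.threshold = threshold    # illustrative skip threshold (assumption)
        self.last_sent = None         # last quantized gradient the server holds

    def step(self, theta, theta_prev):
        g = self.grad_fn(theta)
        if self.last_sent is None:
            # Must communicate on the first round.
            self.last_sent = quantize(g, np.zeros_like(g), self.bits)
            return self.last_sent
        q = quantize(g, self.last_sent, self.bits)
        innovation = np.linalg.norm(q - self.last_sent) ** 2
        # Skip the upload when the quantized gradient barely changed relative to
        # how much the iterate moved; the server then reuses the outdated gradient.
        if innovation <= self.threshold * np.linalg.norm(theta - theta_prev) ** 2:
            return None
        self.last_sent = q
        return q
```

In such a setup, the server would keep each worker's `last_sent` gradient, sum these (reusing stale copies for workers that return `None`), and take a gradient-descent step, so skipped rounds cost no uplink bits.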
Author Information
Jun Sun (Zhejiang University)
Tianyi Chen (Rensselaer Polytechnic Institute)
Georgios Giannakis (University of Minnesota)
Zaiyue Yang (Southern University of Science and Technology)
More from the Same Authors
- 2021 Spotlight: Closing the Gap: Tighter Analysis of Alternating Stochastic Gradient Methods for Bilevel Problems » Tianyi Chen · Yuejiao Sun · Wotao Yin
- 2022 Poster: A Single-timescale Analysis for Stochastic Approximation with Multiple Coupled Sequences » Han Shen · Tianyi Chen
- 2022 Poster: Understanding Benign Overfitting in Gradient-Based Meta Learning » Lisha Chen · Songtao Lu · Tianyi Chen
- 2021 Poster: Closing the Gap: Tighter Analysis of Alternating Stochastic Gradient Methods for Bilevel Problems » Tianyi Chen · Yuejiao Sun · Wotao Yin
- 2021 Poster: CAFE: Catastrophic Data Leakage in Vertical Federated Learning » Xiao Jin · Pin-Yu Chen · Chia-Yi Hsu · Chia-Mu Yu · Tianyi Chen
- 2021 Poster: Heavy Ball Momentum for Conditional Gradient » Bingcong Li · Alireza Sadeghi · Georgios Giannakis
- 2020 Poster: Decentralized TD Tracking with Linear Function Approximation and its Finite-Time Analysis » Gang Wang · Songtao Lu · Georgios Giannakis · Gerald Tesauro · Jian Sun
- 2018 Poster: LAG: Lazily Aggregated Gradient for Communication-Efficient Distributed Learning » Tianyi Chen · Georgios Giannakis · Tao Sun · Wotao Yin
- 2018 Spotlight: LAG: Lazily Aggregated Gradient for Communication-Efficient Distributed Learning » Tianyi Chen · Georgios Giannakis · Tao Sun · Wotao Yin
- 2017 Poster: Solving Most Systems of Random Quadratic Equations » Gang Wang · Georgios Giannakis · Yousef Saad · Jie Chen
- 2016 Poster: Solving Random Systems of Quadratic Equations via Truncated Generalized Gradient Flow » Gang Wang · Georgios Giannakis