Poster
Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction
Wei Jiang · Sifan Yang · Wenhao Yang · Lijun Zhang
West Ballroom A-D #6011
Abstract:
Sign stochastic gradient descent (signSGD) is a communication-efficient method that transmits only the sign of stochastic gradients for parameter updates. Existing literature has demonstrated that signSGD can achieve a convergence rate of $\mathcal{O}(d^{1/2}T^{-1/4})$, where $d$ represents the dimension and $T$ is the iteration number. In this paper, we improve this convergence rate to $\mathcal{O}(d^{1/2}T^{-1/3})$ by introducing the Sign-based Stochastic Variance Reduction (SSVR) method, which employs variance-reduction estimators to track gradients and uses their signs for the updates. For finite-sum problems, our method can be further enhanced to achieve a convergence rate of $\mathcal{O}(m^{1/4}d^{1/2}T^{-1/2})$, where $m$ denotes the number of component functions. Furthermore, we investigate the heterogeneous majority vote in distributed settings and introduce two novel algorithms that attain improved convergence rates of $\mathcal{O}(d^{1/2}T^{-1/2} + dn^{-1/2})$ and $\mathcal{O}(d^{1/4}T^{-1/4})$ respectively, outperforming the previous results of $\mathcal{O}(dT^{-1/4} + dn^{-1/2})$ and $\mathcal{O}(d^{3/8}T^{-1/8})$, where $n$ represents the number of nodes. Numerical experiments across different tasks validate the effectiveness of our proposed methods.
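To make the mechanism described in the abstract concrete, the snippet below is a minimal sketch of a sign-based update driven by a variance-reduced (STORM-style) gradient estimator on a toy least-squares problem. It is an illustration under assumed hyperparameters (step size, estimator momentum, batch size) and an assumed objective, not the authors' exact SSVR algorithm or experimental setup.

```python
# Sketch: sign-based update with a STORM-style variance-reduced gradient estimator.
# Problem, hyperparameters, and schedules are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
d, n = 50, 1000                        # dimension, number of component functions
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star + 0.1 * rng.normal(size=n)

def stoch_grad(x, idx):
    """Stochastic gradient of the least-squares loss on a sampled mini-batch."""
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / len(idx)

x = np.zeros(d)
v = stoch_grad(x, rng.choice(n, 32))   # initialize the gradient estimator
eta, beta = 1e-3, 0.1                  # step size and estimator momentum (assumed values)

for t in range(2000):
    idx = rng.choice(n, 32)
    x_new = x - eta * np.sign(v)       # update with the sign of the tracked gradient
    # Variance-reduced estimator: evaluate the same sample at both x and x_new
    v = stoch_grad(x_new, idx) + (1 - beta) * (v - stoch_grad(x, idx))
    x = x_new

print("final loss:", 0.5 * np.mean((A @ x - b) ** 2))
```

Because only the sign vector enters the parameter update, each worker in a distributed variant would need to communicate just one bit per coordinate, which is the communication advantage the abstract refers to.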