Revisiting Optimal Convergence Rate for Smooth and Non-convex Stochastic Decentralized Optimization
Kun Yuan · Xinmeng Huang · Yiming Chen · Xiaohan Zhang · Yingya Zhang · Pan Pan

Thu Dec 01 09:00 AM -- 11:00 AM (PST) @ Hall J #931

While numerous effective decentralized algorithms have been proposed with theoretical guarantees and empirical successes, the performance limits of decentralized optimization, especially the influence of network topology and its associated weight matrix on the optimal convergence rate, are not fully understood. While Lu and De Sa have recently derived an optimal rate for non-convex stochastic decentralized optimization using weight matrices associated with linear graphs, the optimal rate with general weight matrices remains unclear. This paper revisits non-convex stochastic decentralized optimization and establishes an optimal convergence rate with general weight matrices. In addition, we establish the first optimal rate when the non-convex loss functions further satisfy the Polyak-Łojasiewicz (PL) condition. These results cannot be obtained by following existing lines of analysis in the literature. Instead, we leverage the Ring-Lattice graph, which admits general weight matrices while maintaining the optimal relation between the graph diameter and weight matrix connectivity. Lastly, we develop a new decentralized algorithm that attains the above two optimal rates up to logarithmic factors.
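To make the setting concrete, the sketch below builds a ring-lattice mixing (weight) matrix and runs plain decentralized SGD on a toy quadratic objective. This is only an illustrative sketch of the general framework the abstract discusses, not the paper's proposed algorithm; the uniform averaging weights and the quadratic losses are assumptions chosen for simplicity (quadratics also satisfy the PL condition mentioned above).

```python
import numpy as np

def ring_lattice_mixing_matrix(n, k):
    """Doubly stochastic weight matrix for a ring-lattice graph:
    node i is connected to its k nearest neighbors on each side of a ring,
    with uniform averaging weight 1/(2k+1) (an illustrative choice)."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1.0 / (2 * k + 1)
        for d in range(1, k + 1):
            W[i, (i + d) % n] = 1.0 / (2 * k + 1)
            W[i, (i - d) % n] = 1.0 / (2 * k + 1)
    return W

def dsgd(W, grads, x0, lr=0.1, steps=300, noise=0.0, seed=0):
    """Decentralized SGD sketch: each node averages its neighbors'
    iterates through W, then takes a local (noisy) gradient step."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        g = np.array([grads[i](x[i]) for i in range(len(grads))])
        g += noise * rng.standard_normal(g.shape)  # stochastic gradient noise
        x = W @ x - lr * g
    return x

# Toy problem: node i holds f_i(x) = (x - a_i)^2 / 2, so the global
# minimizer of (1/n) * sum_i f_i is the network average of the a_i.
n, k = 10, 2
W = ring_lattice_mixing_matrix(n, k)
rng = np.random.default_rng(1)
a = rng.standard_normal(n)
grads = [lambda x, ai=ai: x - ai for ai in a]
x = dsgd(W, grads, x0=np.zeros(n))
# At the (noise-free) fixed point, the average of the local iterates
# equals mean(a); individual iterates differ by a small consensus error.
print(x.mean() - a.mean())
```

The mixing matrix here is symmetric and doubly stochastic, so averaging both rows and columns preserves the network mean of the iterates; the degree parameter k controls the trade-off between graph diameter and connectivity that the abstract highlights for the Ring-Lattice topology.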

Author Information

Kun Yuan (Peking University)
Xinmeng Huang (University of Pennsylvania)
Yiming Chen (Alibaba Group)
Xiaohan Zhang (University of Pennsylvania)
Yingya Zhang (Alibaba Group)
Pan Pan (Alibaba Group)