
IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method

Yossi Arjevani, Joan Bruna, Bugra Can, Mert Gurbuzbalaban, Stefanie Jegelka, Hongzhou Lin

Spotlight presentation: Orals & Spotlights Track 21: Optimization
on 2020-12-09T07:20:00-08:00 - 2020-12-09T07:30:00-08:00
Poster Session 4
on 2020-12-09T09:00:00-08:00 - 2020-12-09T11:00:00-08:00
Abstract: We introduce a framework for designing primal methods under the decentralized optimization setting where local functions are smooth and strongly convex. Our approach consists of approximately solving a sequence of sub-problems induced by the accelerated augmented Lagrangian method, thereby providing a systematic way for deriving several well-known decentralized algorithms including EXTRA and SSDA. When coupled with accelerated gradient descent, our framework yields a novel primal algorithm whose convergence rate is optimal and matched by recently derived lower bounds. We provide experimental results that demonstrate the effectiveness of the proposed algorithm on highly ill-conditioned problems.
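The abstract describes a loop of approximately solved sub-problems generated by an accelerated augmented Lagrangian method. The following is a hypothetical minimal sketch of that idea (not the authors' code) for a toy decentralized consensus problem: minimize the sum of local quadratics subject to W x = 0, where W is the Laplacian of the communication graph. The momentum schedule, penalty parameter, and inner gradient-step counts are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical sketch (not the authors' implementation): an inexact
# accelerated augmented-Lagrangian loop for decentralized consensus,
#   minimize sum_i f_i(x_i)  subject to  W x = 0,
# where W is a graph Laplacian (its null space is the consensus set).
# Local functions here are quadratics f_i(x) = (x - b_i)^2 / 2, so the
# consensus optimum is the mean of b.

b = np.array([1.0, 2.0, 6.0])            # local data; optimum = mean(b) = 3
W = np.array([[ 1.0, -1.0,  0.0],        # Laplacian of a 3-node path graph
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
rho = 1.0                                # penalty parameter (assumed)

def grad_aug_lagrangian(x, lam):
    """Gradient in x of f(x) + lam^T W x + (rho/2) ||W x||^2."""
    return (x - b) + W @ lam + rho * W @ (W @ x)

x = np.zeros(3)
lam = lam_prev = np.zeros(3)
step = 1.0 / (1.0 + rho * 9.0)           # 1/L: eig(W) <= 3, so eig(W^2) <= 9

for k in range(500):                     # outer (dual) iterations
    beta = k / (k + 3.0)                 # Nesterov-style extrapolation
    lam_hat = lam + beta * (lam - lam_prev)
    # Inexact primal step: a few warm-started gradient iterations on the
    # augmented Lagrangian, rather than an exact sub-problem solve.
    for _ in range(50):
        x = x - step * grad_aug_lagrangian(x, lam_hat)
    # Accelerated dual ascent on the multiplier.
    lam_prev, lam = lam, lam_hat + rho * (W @ x)

print(x)  # each entry close to 3.0: the nodes reach consensus at the mean
```

Instantiating the inner solver with (accelerated) gradient descent is what, per the abstract, yields known methods such as EXTRA and SSDA as special cases; this sketch only illustrates the outer inexact accelerated Lagrangian structure.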
