

Poster

Dykstra's Algorithm, ADMM, and Coordinate Descent: Connections, Insights, and Extensions

Ryan Tibshirani

Pacific Ballroom #170

Keywords: [ Convex Optimization ] [ Sparsity and Compressed Sensing ] [ Regularization ]


Abstract:

We study connections between Dykstra's algorithm for projecting onto an intersection of convex sets, the alternating direction method of multipliers (ADMM), and block coordinate descent. We prove that coordinate descent for a regularized regression problem, in which the penalty is a separable sum of support functions, is exactly equivalent to Dykstra's algorithm applied to the dual problem. ADMM applied to the dual problem is also shown to be equivalent, in the special case of two sets with one being a linear subspace. These connections, aside from being interesting in their own right, suggest new ways of analyzing and extending coordinate descent. For example, from existing convergence theory on Dykstra's algorithm over polyhedra, we discern that coordinate descent for the lasso problem converges at an (asymptotically) linear rate. We also develop two parallel versions of coordinate descent, based on the Dykstra and ADMM connections.
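For readers unfamiliar with the first ingredient, a minimal sketch of Dykstra's algorithm follows. It is not taken from the paper; the sets (the unit ball and the nonnegative orthant) and all function names are illustrative choices. Unlike plain alternating projections, Dykstra's correction terms make the iterates converge to the exact Euclidean projection of the starting point onto the intersection, not merely to some feasible point.

```python
import numpy as np

def project_ball(z, radius=1.0):
    # Euclidean projection onto the ball ||x||_2 <= radius.
    norm = np.linalg.norm(z)
    return z if norm <= radius else z * (radius / norm)

def project_orthant(z):
    # Euclidean projection onto the nonnegative orthant.
    return np.maximum(z, 0.0)

def dykstra(y, projections, n_iter=200):
    # Dykstra's algorithm: project y onto the intersection of convex sets,
    # given a list of projection operators, one per set. Each set carries a
    # correction term p[i] that is added back before its projection step.
    x = y.copy()
    p = [np.zeros_like(y) for _ in projections]
    for _ in range(n_iter):
        for i, proj in enumerate(projections):
            z = x + p[i]
            x = proj(z)
            p[i] = z - x
    return x

y = np.array([2.0, -1.0, 0.5])
x = dykstra(y, [project_ball, project_orthant])
# For a ball intersected with a cone, the exact projection has a closed
# form (project onto the orthant, then rescale onto the ball), which the
# iterates approach: roughly (0.9701, 0, 0.2425) here.
```

The paper's equivalence result says that iterations of exactly this form, run on the dual of a suitably penalized regression problem, coincide with block coordinate descent on the primal.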
