Bregman Alternating Direction Method of Multipliers
Huahua Wang · Arindam Banerjee

Thu Dec 11 11:00 AM -- 03:00 PM (PST) @ Level 2, room 210D
The mirror descent algorithm (MDA) generalizes gradient descent by replacing the squared Euclidean distance with a Bregman divergence. In this paper, we similarly generalize the alternating direction method of multipliers (ADMM) to Bregman ADMM (BADMM), which allows the choice of different Bregman divergences to exploit the structure of problems. BADMM provides a unified framework for ADMM and its variants, including generalized ADMM, inexact ADMM, and Bethe ADMM. We establish global convergence and an $O(1/T)$ iteration complexity for BADMM. In some cases, BADMM can be faster than ADMM by a factor of $O(n/\ln n)$, where $n$ is the dimensionality. In solving the linear program of the mass transportation problem, BADMM leads to massive parallelism and can easily run on a GPU, where it is several times faster than the highly optimized commercial solver Gurobi.
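The core idea the abstract builds on, replacing the squared Euclidean distance with a Bregman divergence, can be illustrated with plain mirror descent rather than BADMM itself. The sketch below is not the paper's algorithm; it shows the standard entropic mirror descent update on the probability simplex, where choosing the KL divergence as the Bregman divergence turns the usual gradient step into a multiplicative update. The objective and step size are illustrative assumptions.

```python
import numpy as np

def kl_mirror_descent_step(x, grad, step):
    """One mirror descent step on the probability simplex.

    Using the KL divergence as the Bregman divergence (instead of the
    squared Euclidean distance of plain gradient descent) gives the
    closed-form multiplicative update below.
    """
    y = x * np.exp(-step * grad)  # exponentiated-gradient update
    return y / y.sum()            # renormalize onto the simplex

# Illustrative problem: minimize f(x) = 0.5 * ||x - c||^2 over the simplex,
# with gradient f'(x) = x - c; c lies in the simplex, so x should approach c.
c = np.array([0.2, 0.5, 0.3])
x = np.ones(3) / 3
for _ in range(200):
    x = kl_mirror_descent_step(x, x - c, step=1.0)
```

Because the simplex geometry matches the KL divergence, the update stays feasible automatically; this structure-exploiting choice of divergence is the same lever BADMM pulls inside the ADMM updates.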

Author Information

Huahua Wang (University of Minnesota, Twin Cities)
Arindam Banerjee (University of Minnesota, Twin Cities)

Arindam Banerjee is a Professor in the Department of Computer Science & Engineering and a Resident Fellow at the Institute on the Environment at the University of Minnesota, Twin Cities. His research interests are in machine learning, data mining, and their application to complex real-world problems in areas including climate science, ecology, recommendation systems, text analysis, and finance. He has won several awards, including the NSF CAREER Award (2010), the IBM Faculty Award (2013), and six best paper awards at top-tier conferences.