

Poster in Workshop: OPT 2023: Optimization for Machine Learning

A proximal augmented Lagrangian based algorithm for federated learning with constraints

Chuan He · Le Peng · Ju Sun


Abstract:

This paper considers federated learning (FL) with constraints, where the central server and all local clients collectively minimize a sum of local objective functions subject to inequality constraints. To train the model without moving the clients' local data to the central server, we propose an FL framework in which each local client performs multiple updates using its local objective and local constraints, while the central server handles the global constraints and performs aggregation based on the updated local models. In particular, we develop a proximal augmented Lagrangian (AL) based algorithm, whose subproblems are solved by an inexact alternating direction method of multipliers (ADMM) in a federated fashion. Under mild assumptions, we establish worst-case complexity bounds for the proposed algorithm. Our numerical experiments demonstrate the practical advantages of our algorithm in solving linearly constrained quadratic programs and performing Neyman-Pearson classification in the FL setting.
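For orientation, the following is a generic sketch of one outer iteration of a proximal AL scheme for the constrained problem described above; the authors' specific penalty updates, proximal weights, inexactness criteria, and the split between local and global constraints are not given here, and all symbols ($f_i$, $c$, $\beta_k$, $\rho_k$, $\lambda^k$) are illustrative assumptions with the constraints collected into a single map $c$ for simplicity.

\[
\min_{x}\; \sum_{i=1}^{N} f_i(x) \quad \text{s.t.} \quad c(x) \le 0,
\]
where $f_i$ is the local objective of client $i$. A standard proximal AL outer iteration with penalty $\beta_k > 0$ and proximal weight $\rho_k > 0$ reads
\[
x^{k+1} \approx \arg\min_{x}\; \sum_{i=1}^{N} f_i(x) + \frac{\beta_k}{2}\,\big\| [\,c(x) + \lambda^k/\beta_k\,]_+ \big\|^2 + \frac{\rho_k}{2}\,\|x - x^k\|^2,
\qquad
\lambda^{k+1} = \big[\,\lambda^k + \beta_k\, c(x^{k+1})\,\big]_+,
\]
where $[\cdot]_+$ denotes the componentwise nonnegative part. In the federated setting described in the abstract, the subproblem is solved only approximately by an ADMM that alternates between client-side updates and server-side aggregation.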
