Oral in Workshop: Federated Learning: Recent Advances and New Challenges

Federated Learning with Online Adaptive Heterogeneous Local Models

Hanhan Zhou · Tian Lan · Guru Prasadh Venkataramani · Wenbo Ding


Abstract: In Federated Learning (FL), one of the biggest challenges is that client devices often have drastically different computation and communication resources for local updates. To this end, recent research efforts have focused on training heterogeneous local models obtained by adaptively pruning a shared global model. Despite their empirical success, a theoretical analysis of the convergence of these heterogeneous FL algorithms remains an open question. In this paper, we establish sufficient conditions for any FL algorithm with heterogeneous local models to converge to a neighborhood of a stationary point of standard FL at a rate of $O(\frac{1}{\sqrt{Q}})$. For general smooth cost functions and under standard assumptions, our analysis illuminates two key factors impacting the optimality gap between heterogeneous and standard FL: pruning-induced noise and the minimum coverage index, advocating a joint design of the local models' pruning masks in heterogeneous FL algorithms. The results are numerically validated on the MNIST and CIFAR-10 datasets.
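To make the setting concrete, the sketch below illustrates one common mask-based aggregation scheme for heterogeneous local models and a minimum coverage index computed as the smallest number of clients whose pruning masks retain any single parameter. This is a minimal illustration under assumed conventions, not the authors' implementation; the function names `aggregate_masked` and `minimum_coverage_index` are hypothetical.

```python
import numpy as np

def aggregate_masked(local_params, masks):
    """Average heterogeneous local models, where each client only
    updates the subset of global parameters kept by its pruning mask.

    local_params: list of np.ndarray, one (masked) parameter vector per client
    masks: list of binary np.ndarray of the same shape
    Returns the aggregated global parameter vector.
    """
    total = np.zeros_like(local_params[0], dtype=float)
    counts = np.zeros_like(total)
    for p, m in zip(local_params, masks):
        total += m * p   # only masked-in coordinates contribute
        counts += m
    # Each coordinate covered by at least one mask is averaged over
    # the clients that actually trained it.
    covered = counts > 0
    total[covered] /= counts[covered]
    return total

def minimum_coverage_index(masks):
    """Smallest number of clients whose masks retain any one parameter.
    A larger index means every coordinate is trained by more clients,
    which shrinks the optimality gap in the paper's analysis."""
    counts = sum(masks)
    return int(counts.min())

# Toy example: 3 clients, 4-parameter model, heterogeneous masks.
rng = np.random.default_rng(0)
masks = [np.array([1, 1, 0, 0]), np.array([1, 0, 1, 0]), np.array([1, 1, 1, 1])]
params = [rng.normal(size=4) * m for m in masks]
print(aggregate_masked(params, masks))
print(minimum_coverage_index(masks))  # -> 1: the last coordinate is kept only by client 3
```

In this toy example, the joint design of the three masks determines the coverage counts [3, 2, 2, 1], so the minimum coverage index is 1; choosing masks that overlap more evenly would raise it, which the abstract identifies as one of the two factors controlling the optimality gap.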
