

Poster in Workshop: New Frontiers in Federated Learning: Privacy, Fairness, Robustness, Personalization and Data Ownership

Personalized Neural Architecture Search for Federated Learning

Minh Hoang · Carl Kingsford


Abstract:

Federated Learning (FL) is a recently proposed learning paradigm in which decentralized devices collaboratively train a predictive model without exchanging private data. Existing FL frameworks, however, assume a one-size-fits-all model architecture that is collectively trained by local devices and fixed before any of their data is observed. Even with good engineering acumen, this often breaks down when local tasks differ and require different architectural choices to learn effectively. This motivates us to develop a novel personalized neural architecture search (NAS) algorithm for FL, which learns a base architecture that can be structurally personalized for quick adaptation to each local task. On several real-world datasets, our algorithm, FedPNAS, achieves superior performance compared to other benchmarks in heterogeneous multitask scenarios.
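
To illustrate how architecture personalization can coexist with federated weight sharing, below is a minimal, hypothetical PyTorch sketch of the general idea only, not the FedPNAS algorithm itself. Clients average a common set of base operation weights, while each client keeps its own architecture-selection parameters (the alphas of a DARTS-style mixed operation) local, so the effective architecture can differ per task. The names BaseCell and fed_avg, the candidate operations, and all hyperparameters are illustrative assumptions, not taken from the paper.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class BaseCell(nn.Module):
    """A DARTS-style mixed operation: a softmax-weighted sum of candidate ops."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Linear(dim, dim),                            # candidate op 1
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),  # candidate op 2
            nn.Identity(),                                  # candidate op 3 (skip)
        ])
        # Architecture parameters ("alphas"): kept local to each client, never averaged.
        self.alphas = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alphas, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

def fed_avg(state_dicts):
    """Average the shared base weights across clients, skipping architecture params."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        if "alphas" in key:
            continue  # personalization: architecture parameters stay client-specific
        avg[key] = torch.stack([sd[key] for sd in state_dicts]).mean(dim=0)
    return avg

# One communication round over three toy clients with different local data.
clients = [BaseCell(dim=8) for _ in range(3)]
for seed, client in enumerate(clients):
    torch.manual_seed(seed)
    x, y = torch.randn(32, 8), torch.randn(32, 8)
    opt = torch.optim.SGD(client.parameters(), lr=0.1)
    for _ in range(5):  # local steps update both shared weights and local alphas
        opt.zero_grad()
        F.mse_loss(client(x), y).backward()
        opt.step()

# Server aggregates only the shared weights and broadcasts them back;
# each client keeps its own alphas, so effective architectures can diverge.
global_state = fed_avg([c.state_dict() for c in clients])
del global_state["alphas"]
for c in clients:
    c.load_state_dict(global_state, strict=False)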
