Invited Talk: Peter Kairouz - The Fundamental Price of Secure Aggregation in Differentially Private Federated Learning
Peter Kairouz

In this talk, we consider the problem of training a machine learning model with distributed differential privacy (DP), where secure aggregation (SecAgg) is used to ensure that the server only sees the noisy sum of model updates in each training round. Taking into account the linearity constraints imposed by SecAgg, we characterize the optimal communication cost required to obtain the best accuracy achievable under central DP (i.e., under a fully trusted server and no communication constraints), and we derive a simple and efficient scheme that achieves this optimal bandwidth. We evaluate the optimal scheme on real-world federated learning tasks and show that we can reduce the communication cost to under 1.78 bits per parameter in realistic privacy settings without decreasing test-time performance. We conclude the talk with a few important and non-trivial open research directions.
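To make the setup concrete, the following is a minimal, illustrative sketch (not the scheme from the talk) of distributed DP under a secure-aggregation-style linearity constraint: each client clips its update, adds its own share of the aggregate noise, and encodes the result as fixed-point integers over a finite ring, so the server can only recover the modular sum of the noisy encodings. All parameters (field size, clip norm, noise scale, scaling factor) are placeholders chosen for the example.

```python
import numpy as np

# Placeholder parameters (illustrative only, not from the talk).
FIELD_SIZE = 2 ** 16      # modulus of the ring used by the aggregation protocol
NUM_CLIENTS = 10
CLIP_NORM = 1.0           # L2 clipping bound applied by each client
NOISE_STDDEV = 0.5        # per-client share of the aggregate DP noise
SCALE = 2 ** 8            # fixed-point scaling before modular encoding

def client_encode(update, rng):
    """Clip, add a local noise share, and quantize onto the ring."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, CLIP_NORM / max(norm, 1e-12))
    noisy = clipped + rng.normal(0.0, NOISE_STDDEV, size=update.shape)
    # Map to integers mod FIELD_SIZE; the aggregator works over a finite ring.
    return np.round(noisy * SCALE).astype(np.int64) % FIELD_SIZE

def server_decode(encoded_updates):
    """The server only observes the modular sum of the encoded updates."""
    total = np.zeros_like(encoded_updates[0])
    for enc in encoded_updates:
        total = (total + enc) % FIELD_SIZE
    # Re-center the modular sum into a signed range before rescaling.
    signed = np.where(total >= FIELD_SIZE // 2, total - FIELD_SIZE, total)
    return signed.astype(np.float64) / SCALE

rng = np.random.default_rng(0)
updates = [rng.normal(size=4) for _ in range(NUM_CLIENTS)]
encoded = [client_encode(u, rng) for u in updates]
print("noisy aggregate seen by the server:", server_decode(encoded))
```

The communication question studied in the talk is how coarsely the per-client encodings can be quantized (here governed by `SCALE` and `FIELD_SIZE`) while still matching central-DP accuracy.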

Author Information

Peter Kairouz (Google)
