

Poster in Workshop: Federated Learning: Recent Advances and New Challenges

FLUTE: A Scalable, Extensible Framework for High-Performance Federated Learning Simulations

Mirian Hipolito Garcia · Andre Manoel · Daniel Madrigal · Robert Sim · Dimitrios Dimitriadis


Abstract:

In this paper we introduce "Federated Learning Utilities and Tools for Experimentation" (FLUTE), a high-performance, open-source platform for federated learning research and offline simulations. The goal of FLUTE is to enable rapid prototyping and simulation of new federated learning algorithms at scale, including novel optimization, privacy, and communication strategies. We describe the architecture of FLUTE, which allows arbitrary federated modeling schemes to be realized; compare the platform with other state-of-the-art platforms; and describe the features FLUTE offers for experimentation in core areas of active research, such as optimization, privacy, and scalability. A comparison with other established platforms shows speed-ups of up to 42x and a 3x reduction in memory footprint. A sample of the platform's capabilities is presented in the Appendix for a range of tasks and other functionality, such as scaling and a variety of federated optimizers.
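To give a sense of the client-server simulation pattern that a platform like FLUTE orchestrates at scale, the sketch below runs a minimal federated averaging (FedAvg) round loop on synthetic data. This is a generic, framework-agnostic illustration only; the function and variable names (`local_sgd`, `clients`, `w_global`) are hypothetical and are not FLUTE's actual API.

```python
# Minimal FedAvg simulation sketch (illustrative only; not FLUTE's API).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic, mildly non-IID client data: each client's features are drawn
# around its own mean, and labels follow a shared linear model.
true_w = np.array([2.0, -3.0])
clients = []
for c in range(10):
    X = rng.normal(loc=c * 0.1, scale=1.0, size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

def local_sgd(w, X, y, lr=0.05, epochs=5):
    """Run a few epochs of full-batch gradient descent on one client's data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

# Server loop: sample a subset of clients each round, collect their locally
# updated models, and average them into the new global model.
w_global = np.zeros(2)
for round_ in range(20):
    sampled = rng.choice(len(clients), size=4, replace=False)
    updates = [local_sgd(w_global, *clients[i]) for i in sampled]
    w_global = np.mean(updates, axis=0)  # unweighted average for equal-sized shards

print("recovered weights:", w_global)  # approaches [2.0, -3.0]
```

A simulation platform such as FLUTE takes this same round structure and distributes the per-client work across processes and machines, which is where the reported speed-ups and memory savings come from.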
