Poster
Federated Learning from Pre-Trained Models: A Contrastive Learning Approach
Yue Tan · Guodong Long · Jie Ma · LU LIU · Tianyi Zhou · Jing Jiang

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #203

Federated Learning (FL) is a machine learning paradigm that allows decentralized clients to learn collaboratively without sharing their private data. However, excessive computation and communication demands pose challenges to current FL frameworks, especially when training large-scale models. To prevent these issues from hindering the deployment of FL systems, we propose a lightweight framework where clients jointly learn to fuse the representations generated by multiple fixed pre-trained models rather than training a large-scale model from scratch. This leads us to a more practical FL problem by considering how to capture more client-specific and class-relevant information from the pre-trained models and jointly improve each client's ability to exploit those off-the-shelf models. Here, we design a Federated Prototype-wise Contrastive Learning (FedPCL) approach which shares knowledge across clients through their class prototypes and builds client-specific representations in a prototype-wise contrastive manner. Sharing prototypes rather than learnable model parameters allows each client to fuse the representations in a personalized way while keeping the shared knowledge in a compact form for efficient communication. We perform a thorough evaluation of the proposed FedPCL in the lightweight framework, measuring and visualizing its ability to fuse various pre-trained models on popular FL datasets.
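To make the prototype-sharing idea concrete, here is a minimal sketch of a prototype-wise contrastive loss and per-class prototype computation in PyTorch. This is an illustrative reconstruction from the abstract, not the authors' released code: the function names, the use of mean embeddings as prototypes, and the temperature value are assumptions, and the loss shown is a generic InfoNCE-style objective over class prototypes.

```python
import torch
import torch.nn.functional as F

def class_prototypes(z, labels, num_classes):
    """Average each class's embeddings into one prototype vector.
    Prototypes are compact (C x d), so sharing them is cheap compared
    with exchanging full model parameters.
    z: (B, d) embeddings; labels: (B,) class indices."""
    protos = torch.zeros(num_classes, z.size(1))
    for c in range(num_classes):
        protos[c] = z[labels == c].mean(dim=0)
    return protos

def prototype_contrastive_loss(z, labels, prototypes, tau=0.07):
    """Pull each embedding toward its own class prototype and push it
    away from the other classes' prototypes (InfoNCE over prototypes).
    z: (B, d); prototypes: (C, d); labels: (B,)."""
    z = F.normalize(z, dim=1)
    protos = F.normalize(prototypes, dim=1)
    logits = z @ protos.t() / tau  # (B, C) cosine similarities / temperature
    return F.cross_entropy(logits, labels)
```

In a federated round, each client would compute local prototypes from its fused pre-trained representations, send only those prototypes to the server for aggregation, and then use the aggregated prototypes as contrastive targets in the loss above.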

Author Information

Yue Tan (University of Technology Sydney)
Guodong Long (University of Technology Sydney (UTS))
Jie Ma (University of Technology Sydney)
LU LIU (Google)

Lu Liu is a third-year Ph.D. student at the University of Technology Sydney. Her research interests lie in machine learning, meta-learning, and low-shot learning.

Tianyi Zhou (University of Washington, Seattle)

Tianyi Zhou is a Ph.D. student in Computer Science at the University of Washington and a member of the MELODI lab led by Prof. Jeff A. Bilmes. He will be joining the University of Maryland, College Park in 2022 as a tenure-track assistant professor in the Department of Computer Science, affiliated with UMIACS. His research interests are in machine learning, optimization, and natural language processing. He has published ~60 papers at NeurIPS, ICML, ICLR, AISTATS, EMNLP, NAACL, COLING, KDD, ICDM, AAAI, IJCAI, ISIT, Machine Learning (Springer), IEEE TIP/TNNLS/TKDE, etc. He is the recipient of the Best Student Paper Award at ICDM 2013 and the 2020 IEEE TCSC Most Influential Paper Award.

Jing Jiang (University of Technology Sydney)
