Training Heterogeneous Client Models using Knowledge Distillation in Serverless Federated Learning

Mohak Chadha, Pulkit Khera, Jianfeng Gu, Osama Abboud, Michael Gerndt

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Federated Learning (FL) is an emerging machine learning paradigm that enables the collaborative training of a shared global model across distributed clients while keeping the data decentralized. Recent works on designing systems for efficient FL have shown that utilizing serverless computing technologies, particularly Function-as-a-Service (FaaS), can enhance resource efficiency, reduce training costs, and alleviate the complex infrastructure management burden on data holders. However, existing serverless FL systems implicitly assume a uniform global model architecture across all participating clients during training. This assumption fails to address fundamental challenges in practical FL arising from resource and statistical data heterogeneity among FL clients. To address these challenges and enable heterogeneous client models in serverless FL, we utilize Knowledge Distillation (KD) in this paper. Towards this, we propose novel optimized serverless workflows for two popular conventional federated KD techniques, i.e., FedMD and FedDF. We implement these workflows by introducing several extensions to an open-source serverless FL system called FedLess. Moreover, we comprehensively evaluate the two strategies on multiple datasets across varying levels of client data heterogeneity using heterogeneous client models, with respect to accuracy, fine-grained training times, and costs. Results from our experiments demonstrate that serverless FedDF is more robust to extreme non-IID data distributions, is faster, and leads to lower costs than serverless FedMD. In addition, compared to the original implementation, our optimizations for particular steps in FedMD and FedDF lead to average speedups of 3.5x and 1.76x, respectively, across all datasets.
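For readers unfamiliar with the core mechanism, the knowledge-distillation techniques the abstract refers to (FedMD, FedDF) are built on matching temperature-softened output distributions between models. The sketch below is illustrative only and not taken from the paper: a minimal, pure-Python temperature-scaled distillation loss (all function names are our own), where a student model's softened predictions are pulled toward a teacher's (or, in FedDF-style ensemble distillation, toward the averaged logits of an ensemble of client models).

```python
import math

def softened_probs(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the
    # distribution, exposing more of the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=3.0):
    # KL(teacher || student) over softened distributions: the standard
    # distillation objective. In an ensemble-distillation setting,
    # teacher_logits would be the average of several client models' logits.
    p = softened_probs(teacher_logits, temperature)
    q = softened_probs(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

The loss is zero when student and teacher agree exactly and grows as their softened predictions diverge; in practice it is combined with the ordinary cross-entropy loss on labeled data, with the temperature and mixing weight as hyperparameters.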

Original language: English
Title of host publication: 39th Annual ACM Symposium on Applied Computing, SAC 2024
Publisher: Association for Computing Machinery
Pages: 997-1006
Number of pages: 10
ISBN (Electronic): 9798400702433
DOIs
State: Published - 8 Apr 2024
Event: 39th Annual ACM Symposium on Applied Computing, SAC 2024 - Avila, Spain
Duration: 8 Apr 2024 – 12 Apr 2024

Publication series

Name: Proceedings of the ACM Symposium on Applied Computing

Conference

Conference: 39th Annual ACM Symposium on Applied Computing, SAC 2024
Country/Territory: Spain
City: Avila
Period: 8/04/24 – 12/04/24

Keywords

  • deep learning
  • FaaS
  • federated learning
  • knowledge distillation
  • scalability of learning algorithms
  • serverless computing

