Federated Learning via Decentralized Dataset Distillation in Resource-Constrained Edge Environments

Rui Song, Dai Liu, Dave Zhenyu Chen, Andreas Festag, Carsten Trinitis, Martin Schulz, Alois Knoll

Publication: Contribution to book/report/conference proceedings › Conference paper › Peer-reviewed

13 citations (Scopus)

Abstract

In federated learning, all networked clients contribute cooperatively to model training. However, as model sizes grow, even sharing trained partial models often causes severe communication bottlenecks in the underlying networks, especially when they are communicated iteratively. In this paper, we introduce FedD3, a federated learning framework that requires only one-shot communication by integrating dataset distillation instances. Instead of sharing model updates as in other federated learning approaches, FedD3 lets the connected clients distill their local datasets independently, and then aggregates those decentralized distilled datasets (e.g., a few unrecognizable images) across the network for model training. Our experimental results show that FedD3 significantly outperforms other federated learning frameworks in terms of required communication volume, while offering the additional benefit of balancing the trade-off between accuracy and communication cost depending on the usage scenario or target dataset. For instance, when training an AlexNet model on CIFAR-10 with 10 clients under a non-independent and identically distributed (Non-IID) setting, FedD3 can either increase accuracy by over 71% at a similar communication volume, or save 98% of communication volume while reaching the same accuracy, compared to other one-shot federated learning approaches.
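The one-shot protocol described in the abstract — clients compress their local data into a small synthetic set, upload it once, and the server trains on the pooled synthetic data — can be sketched as follows. This is a minimal illustrative stand-in, not the paper's method: real dataset distillation uses optimization-based techniques (e.g., kernel inducing points), whereas here each client "distills" its data into per-class mean vectors and the server fits a nearest-centroid classifier. All function names and the toy data are assumptions for illustration.

```python
# Toy sketch of one-shot federated learning via dataset distillation.
# Assumption: per-class means stand in for real distilled synthetic data.
from collections import defaultdict

def distill_local(dataset):
    """Client side: compress a local dataset into one synthetic point per class."""
    sums, counts = defaultdict(lambda: None), defaultdict(int)
    for x, y in dataset:
        if sums[y] is None:
            sums[y] = [0.0] * len(x)
        sums[y] = [s + v for s, v in zip(sums[y], x)]
        counts[y] += 1
    return [([s / counts[y] for s in sums[y]], y) for y in sums]

def server_train(distilled_sets):
    """Server side: pool the distilled sets and fit class centroids."""
    pooled = [point for ds in distilled_sets for point in ds]
    return distill_local(pooled)  # centroid of the client centroids per class

def predict(model, x):
    """Nearest-centroid prediction on the server-side model."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(c, x))
    return min(model, key=lambda p: dist(p[0]))[1]

# Two clients with Non-IID local data (different class balance).
client_a = [([0.0, 0.1], 0), ([0.2, 0.0], 0), ([1.0, 0.9], 1)]
client_b = [([0.1, 0.0], 0), ([0.9, 1.0], 1), ([1.1, 1.1], 1)]

# One-shot communication: each client uploads its distilled set exactly once.
model = server_train([distill_local(client_a), distill_local(client_b)])
print(predict(model, [0.05, 0.05]))  # -> 0
print(predict(model, [1.0, 1.0]))    # -> 1
```

Each client here transmits only one vector per class rather than iterative model updates, which mirrors the communication-volume saving the paper targets, albeit with a far cruder compression scheme.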

Original language: English
Title: IJCNN 2023 - International Joint Conference on Neural Networks, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9781665488679
DOIs
Publication status: Published - 2023
Event: 2023 International Joint Conference on Neural Networks, IJCNN 2023 - Gold Coast, Australia
Duration: 18 June 2023 - 23 June 2023

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2023-June

Conference

Conference: 2023 International Joint Conference on Neural Networks, IJCNN 2023
Country/Territory: Australia
City: Gold Coast
Period: 18/06/23 - 23/06/23
