TY - GEN
T1 - Communication Topologies for Decentralized Federated Learning
AU - Dötzer, Michael
AU - Mao, Yixin
AU - Diepold, Klaus
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Conventional federated learning aims at enabling clients to contribute to a global training process while keeping their own data local. However, as the number of devices on the network increases, it can no longer be assumed that there is a central entity with sufficient bandwidth or computing resources to handle the volume of requests. Hence, in this paper, we consider implementing federated learning with different topologies in a network without a central entity. We compare hierarchical and decentralized topologies with varying degrees of interconnectivity. In our experiments, we use 50 clients with small CNNs and the MNIST, FashionMNIST, or CIFAR-10 datasets. Our results show that models in a decentralized network can achieve performance similar to that of models in a centralized network if the topology is chosen carefully. We relate the accuracy of the models to the estimated communication overhead by considering the number of communication connections required for a given topology. These results indicate that cluster topologies can leverage similarities in data distributions and mitigate the communication effort without sacrificing performance. In addition, we present a simple method to estimate the information transfer performance of a topology without empirical testing.
AB - Conventional federated learning aims at enabling clients to contribute to a global training process while keeping their own data local. However, as the number of devices on the network increases, it can no longer be assumed that there is a central entity with sufficient bandwidth or computing resources to handle the volume of requests. Hence, in this paper, we consider implementing federated learning with different topologies in a network without a central entity. We compare hierarchical and decentralized topologies with varying degrees of interconnectivity. In our experiments, we use 50 clients with small CNNs and the MNIST, FashionMNIST, or CIFAR-10 datasets. Our results show that models in a decentralized network can achieve performance similar to that of models in a centralized network if the topology is chosen carefully. We relate the accuracy of the models to the estimated communication overhead by considering the number of communication connections required for a given topology. These results indicate that cluster topologies can leverage similarities in data distributions and mitigate the communication effort without sacrificing performance. In addition, we present a simple method to estimate the information transfer performance of a topology without empirical testing.
KW - Federated learning
KW - clustering applications
KW - network topology
UR - http://www.scopus.com/inward/record.url?scp=85179520905&partnerID=8YFLogxK
U2 - 10.1109/FMEC59375.2023.10306161
DO - 10.1109/FMEC59375.2023.10306161
M3 - Conference contribution
AN - SCOPUS:85179520905
T3 - 2023 8th International Conference on Fog and Mobile Edge Computing, FMEC 2023
SP - 232
EP - 238
BT - 2023 8th International Conference on Fog and Mobile Edge Computing, FMEC 2023
A2 - Quwaider, Muhannad
A2 - Awaysheh, Feras M.
A2 - Jararweh, Yaser
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 8th IEEE International Conference on Fog and Mobile Edge Computing, FMEC 2023
Y2 - 18 September 2023 through 20 September 2023
ER -