TY - CONF
T1 - Trustworthy Federated Learning via Decentralized Consensus Under Communication Constraints
AU - Ye, Wenxuan
AU - An, Xueli
AU - Yan, Xueqiang
AU - Carle, Georg
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - The advent of 6G is anticipated to bring advanced support for decentralized data processing, promoting the exploration of Federated Learning (FL). FL enables collaborative learning among distributed clients without direct access to raw data, offering benefits in communication efficiency and privacy preservation. However, several challenges hinder its widespread adoption in the 6G context, such as anomalous local models, packet loss caused by communication resource limitations, and centralized data management. To address these obstacles, this work proposes a trustworthy architecture for supporting FL with three key contributions. First, it develops a robust model aggregation method that combines model analysis with client reputation, withstanding abnormal models and enhancing system resilience. Second, it utilizes all received models, including those only partially received due to packet loss, to optimize accuracy while ensuring that each participant's contribution is fairly reflected in its reputation. Third, it customizes a consensus mechanism in Distributed Ledger Technology (DLT) for the proposed aggregation rule; this mechanism provides transparent and immutable records of data exchanges and decentralizes the system. Our simulations demonstrate that the proposed architecture accurately identifies outlier models and utilizes incomplete models, improving global model accuracy by 13% over a baseline that averages randomly selected fully received models. Benchmarked against the state-of-the-art Krum algorithm, our approach achieves a 5% performance improvement.
AB - The advent of 6G is anticipated to bring advanced support for decentralized data processing, promoting the exploration of Federated Learning (FL). FL enables collaborative learning among distributed clients without direct access to raw data, offering benefits in communication efficiency and privacy preservation. However, several challenges hinder its widespread adoption in the 6G context, such as anomalous local models, packet loss caused by communication resource limitations, and centralized data management. To address these obstacles, this work proposes a trustworthy architecture for supporting FL with three key contributions. First, it develops a robust model aggregation method that combines model analysis with client reputation, withstanding abnormal models and enhancing system resilience. Second, it utilizes all received models, including those only partially received due to packet loss, to optimize accuracy while ensuring that each participant's contribution is fairly reflected in its reputation. Third, it customizes a consensus mechanism in Distributed Ledger Technology (DLT) for the proposed aggregation rule; this mechanism provides transparent and immutable records of data exchanges and decentralizes the system. Our simulations demonstrate that the proposed architecture accurately identifies outlier models and utilizes incomplete models, improving global model accuracy by 13% over a baseline that averages randomly selected fully received models. Benchmarked against the state-of-the-art Krum algorithm, our approach achieves a 5% performance improvement.
KW - 6G
KW - Distributed ledger technology
KW - Federated learning
KW - Model aggregation
UR - http://www.scopus.com/inward/record.url?scp=85190251496&partnerID=8YFLogxK
U2 - 10.1109/GCWkshps58843.2023.10464450
DO - 10.1109/GCWkshps58843.2023.10464450
M3 - Conference contribution
AN - SCOPUS:85190251496
T3 - 2023 IEEE Globecom Workshops, GC Wkshps 2023
SP - 38
EP - 43
BT - 2023 IEEE Globecom Workshops, GC Wkshps 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE Globecom Workshops, GC Wkshps 2023
Y2 - 4 December 2023 through 8 December 2023
ER -