
e-Article

Federated Learning Under Heterogeneous and Correlated Client Availability
Document Type
Periodical
Source
IEEE/ACM Transactions on Networking, 32(2):1451-1460, Apr. 2024
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Signal Processing and Analysis
Servers
Training
Correlation
Convergence
Markov processes
Federated learning
Smart phones
correlated client availability
Markov chains
Language
English
ISSN
1063-6692 (Print)
1558-2566 (Online)
Abstract
In Federated Learning (FL), devices (also referred to as clients) can exhibit heterogeneous availability patterns, often correlated over time and across clients. This paper addresses the problem of heterogeneous and correlated client availability in FL. Our theoretical analysis is the first to demonstrate the negative impact of correlation on the convergence rate of FL algorithms, and it highlights a trade-off between optimization error (related to convergence speed) and bias error (indicative of model quality). To optimize this trade-off, we propose Correlation-Aware FL (CA-Fed), a novel algorithm that dynamically balances the competing objectives of fast convergence and minimal model bias. CA-Fed achieves this by dynamically adjusting the aggregation weight assigned to each client and by selectively excluding clients with high temporal correlation and low availability. Experimental evaluations on diverse datasets demonstrate the effectiveness of CA-Fed compared to state-of-the-art methods; in particular, CA-Fed achieves the best trade-off between training time and test accuracy. By dynamically handling clients with high temporal correlation and low availability, CA-Fed emerges as a promising solution for mitigating the detrimental impact of correlated client availability in FL.
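
The abstract describes CA-Fed only at a high level. A minimal sketch of such a correlation-aware aggregation step, assuming a hypothetical ca_fed_aggregate routine with illustrative corr_threshold and avail_threshold parameters and an inverse-availability weighting (not the authors' exact update rule), could look as follows in Python:

    import numpy as np

    def ca_fed_aggregate(client_updates, availability, correlation,
                         corr_threshold=0.5, avail_threshold=0.1):
        # client_updates: dict client_id -> np.ndarray with the local model update
        # availability:   dict client_id -> estimated probability the client is available
        # correlation:    dict client_id -> estimated temporal correlation of the
        #                 client's availability process (e.g. derived from its
        #                 Markov-chain transition probabilities)

        # Exclude clients whose availability process is both highly correlated
        # over time and rarely available, as the abstract indicates CA-Fed does.
        kept = [c for c in client_updates
                if not (correlation[c] > corr_threshold
                        and availability[c] < avail_threshold)]
        if not kept:
            raise ValueError("all clients excluded; relax the thresholds")

        # Illustrative weighting (an assumption, not the paper's rule): up-weight
        # rarely available clients among those kept, so the aggregate is not
        # biased toward frequently available ones.
        raw = np.array([1.0 / max(availability[c], 1e-8) for c in kept])
        weights = raw / raw.sum()

        # Weighted average of the kept clients' updates.
        return sum(w * client_updates[c] for w, c in zip(weights, kept))

    # Example with three toy clients and 2-dimensional updates:
    updates = {c: np.random.randn(2) for c in ("a", "b", "c")}
    avail = {"a": 0.9, "b": 0.05, "c": 0.4}
    corr = {"a": 0.1, "b": 0.8, "c": 0.3}
    print(ca_fed_aggregate(updates, avail, corr))  # client "b" is excluded

In this sketch the exclusion rule captures the paper's stated idea of dropping clients that combine high temporal correlation with low availability, while the weighting step stands in for CA-Fed's dynamic adjustment of per-client aggregation weights.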