Academic Paper

Federated Learning With Client Selection and Gradient Compression in Heterogeneous Edge Systems
Document Type
Periodical
Source
IEEE Transactions on Mobile Computing, 23(5):5446-5461, May 2024
Subject
Computing and Processing
Communication, Networking and Broadcast Technologies
Signal Processing and Analysis
Training
Convergence
Adaptation models
Optimization
Mobile computing
Quantization (signal)
Federated learning
Edge computing
federated learning
capability heterogeneity
statistical heterogeneity
Language
English
ISSN
1536-1233
1558-0660
2161-9875
Abstract
Federated learning (FL) has recently gained tremendous attention in edge computing and the Internet of Things, owing to its ability to let distributed clients cooperatively train models while keeping raw data local. However, existing approaches often suffer from limited communication resources, dynamic network conditions, and heterogeneous client properties, all of which hinder efficient FL. To tackle these challenges simultaneously, we propose a heterogeneity-aware FL framework, called FedCG, with adaptive client selection and gradient compression. Specifically, FedCG introduces diversity into client selection, aiming to select a representative client subset in light of statistical heterogeneity. The selected clients are assigned different compression ratios according to their heterogeneous, time-varying capabilities. After local training, they upload sparse model updates matched to those capabilities for global aggregation, which effectively reduces communication cost and mitigates the straggler effect. More importantly, rather than naively combining client selection and gradient compression, we show that the two decisions are tightly coupled and must be optimized jointly. We theoretically analyze the impact of both client selection and gradient compression on convergence performance. Guided by the resulting convergence rate, we develop an iteration-based algorithm that jointly optimizes client selection and compression-ratio decisions via submodular maximization and linear programming. On this basis, we propose a quantized extension of FedCG, termed Q-FedCG, which further adjusts quantization levels based on gradient innovation. Extensive experiments on both real-world prototypes and simulations show that FedCG and its extension achieve up to a 6.4× speedup.
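The abstract's notion of clients uploading "sparse model updates matching their capabilities" is commonly realized with top-k gradient sparsification, where each client keeps only a capability-dependent fraction of gradient entries. The sketch below is an illustrative reconstruction of that primitive, not the paper's actual implementation; the `topk_sparsify` function name and the per-client ratio dictionary are assumptions for the example.

```python
import numpy as np

def topk_sparsify(grad, ratio):
    """Keep only the largest-magnitude fraction `ratio` of gradient entries.

    A standard sparsification primitive; in a FedCG-style scheme each
    client would be assigned its own `ratio` based on its (time-varying)
    communication and compute capability.
    """
    flat = grad.ravel()
    k = max(1, int(ratio * flat.size))
    # Indices of the k largest-magnitude entries (unordered partition).
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape)

# Hypothetical per-client compression ratios: a stronger client uploads
# a denser update than a constrained one.
ratios = {"client_a": 0.5, "client_b": 0.1}

rng = np.random.default_rng(0)
grad = rng.standard_normal((4, 4))
update = topk_sparsify(grad, ratios["client_b"])
```

Only the surviving entries (values plus indices) need to be transmitted, which is what makes the per-client ratio directly translate into per-client upload cost; the paper's contribution is choosing these ratios jointly with the client-selection decision rather than independently.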