Academic Article

Compressed Client Selection for Efficient Communication in Federated Learning
Document Type
Conference
Source
2023 IEEE 20th Consumer Communications & Networking Conference (CCNC), pp. 508-516, Jan. 2023
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Robotics and Control Systems
Training
Performance evaluation
Machine learning algorithms
Federated learning
Fitting
Multilayer perceptrons
Particle measurements
Federated Learning
Communication
Machine Learning
Language
English
ISSN
2331-9860
Abstract
Federated learning (FL) is a distributed approach that enables collaborative training of a shared machine learning (ML) model for a given task. FL requires bandwidth-demanding communication between devices and a central server, which causes issues such as communication bottlenecks and poor scalability of the network. Therefore, we introduce the CCS (Compressed Client Selection) algorithm, aimed at decreasing the overall communication cost of fitting a model in the FL environment. CCS employs a biased client selection strategy that reduces both the number of devices training the ML model and the number of rounds required to reach convergence. In addition, the Count Sketch compression method is applied to reduce the overhead of client-to-server communication. A use case on the Human Activity Recognition dataset is carried out to evaluate CCS and compare it with other state-of-the-art approaches. Experimental evaluations show that CCS efficiently reduces the overall communication overhead of fitting a model to convergence in an FL environment. In particular, CCS reduces the communication overhead by up to 90% compared to approaches from the literature, while providing good convergence even in scenarios where the data are non-independently and identically distributed (non-IID) among client devices.
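The abstract mentions Count Sketch compression of client-to-server updates. The following is a minimal Python sketch, not the authors' implementation, of how a Count Sketch could compress a client's model update before transmission; the class and function names (CountSketch, compress, decompress), the shared seed, and the table dimensions are all illustrative assumptions.

# Minimal illustrative Count Sketch for compressing a model update (assumed, not from the paper).
import numpy as np

class CountSketch:
    def __init__(self, rows: int, cols: int, dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # Hash functions are simulated with pre-drawn bucket/sign tables;
        # client and server must share the same seed so the sketch is decodable.
        self.buckets = rng.integers(0, cols, size=(rows, dim))  # bucket index per coordinate
        self.signs = rng.choice([-1, 1], size=(rows, dim))      # random sign per coordinate
        self.rows, self.cols, self.dim = rows, cols, dim

    def compress(self, update: np.ndarray) -> np.ndarray:
        # Project a length-`dim` update into a small rows x cols table (what the client sends).
        table = np.zeros((self.rows, self.cols))
        for r in range(self.rows):
            np.add.at(table[r], self.buckets[r], self.signs[r] * update)
        return table

    def decompress(self, table: np.ndarray) -> np.ndarray:
        # Estimate each coordinate as the median over rows of its signed bucket value.
        estimates = np.stack(
            [self.signs[r] * table[r, self.buckets[r]] for r in range(self.rows)]
        )
        return np.median(estimates, axis=0)

# Toy usage: compress an update with a few large coordinates and recover an approximation.
if __name__ == "__main__":
    dim = 10_000
    update = np.zeros(dim)
    update[:50] = np.random.randn(50) * 5           # dominant coordinates survive sketching best
    cs = CountSketch(rows=5, cols=500, dim=dim, seed=42)
    table = cs.compress(update)                      # transmitted instead of the full update
    approx = cs.decompress(table)                    # server-side estimate
    print("compression ratio:", dim / table.size)
    print("mean error on large coords:", np.abs(approx[:50] - update[:50]).mean())

In this toy setting the client transmits a 5 x 500 table instead of a 10,000-dimensional vector, illustrating how sketching trades a bounded approximation error for a large reduction in uplink traffic; the exact sketch sizes and selection policy used by CCS are described in the paper itself.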