Academic Paper

Communication-Efficient Federated Learning with Channel-Aware Sparsification over Wireless Networks
Document Type
Conference
Source
2023 57th Annual Conference on Information Sciences and Systems (CISS), pp. 1-6, Mar. 2023
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Photonics and Electrooptics
Robotics and Control Systems
Signal Processing and Analysis
Training
Time division multiple access
Quantization (signal)
Federated learning
Wireless networks
Computational modeling
Training data
Language
English
Abstract
Federated learning (FL) has recently emerged as a popular distributed learning paradigm because it allows collaborative training of a global machine learning model while keeping each participating worker's training data local. This paradigm enables model training to harness the computing power across the FL network while preserving the privacy of local training data. However, communication efficiency has become one of the major concerns in FL due to frequent model updates over the network, especially for devices in wireless networks with limited communication resources. Although various communication-efficient compression mechanisms (e.g., quantization and sparsification) have been incorporated into FL, most existing studies focus only on resource allocation optimization under predetermined compression mechanisms, and few take the wireless communication channel into account in the design of the compression mechanism itself. In this paper, we study the impact of sparsification and wireless channels on FL performance. Specifically, we propose a channel-aware sparsification mechanism and derive a closed-form solution for communication time allocation among workers in a TDMA setting. Extensive simulations are conducted to validate the effectiveness of the proposed mechanism.
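
To make the idea of channel-aware sparsification concrete, the following is a minimal sketch, not the paper's actual mechanism or its closed-form TDMA time allocation: it assumes each worker top-k sparsifies its gradient, with the per-worker budget k_i chosen in proportion to that worker's channel rate. The function names (sparsify_topk, channel_aware_sparsify) and the proportional-rate rule are illustrative assumptions.

```python
import numpy as np

def sparsify_topk(grad: np.ndarray, k: int) -> np.ndarray:
    """Keep the k largest-magnitude entries of grad and zero the rest."""
    if k <= 0:
        return np.zeros_like(grad)
    sparse = np.zeros_like(grad)
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse[idx] = grad[idx]
    return sparse

def channel_aware_sparsify(grads, channel_rates, total_budget):
    """Illustrative channel-aware sparsification: split a total payload
    budget (number of transmitted gradient entries per round) across
    workers in proportion to their channel rates, then top-k sparsify
    each worker's gradient to its share."""
    rates = np.asarray(channel_rates, dtype=float)
    shares = rates / rates.sum()
    ks = np.maximum(1, np.floor(shares * total_budget).astype(int))
    return [sparsify_topk(g, k) for g, k in zip(grads, ks)]

# Example: 3 workers with different (hypothetical) channel rates in bits/s.
rng = np.random.default_rng(0)
grads = [rng.standard_normal(1000) for _ in range(3)]
rates = [1e6, 4e6, 2e6]
sparse_grads = channel_aware_sparsify(grads, rates, total_budget=300)
# In a TDMA round, worker i's slot length would then be set so that its
# k_i retained entries fit its rate; the paper derives this allocation
# in closed form, which is not reproduced here.
```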