Academic paper

NQFL: Nonuniform Quantization for Communication Efficient Federated Learning
Document Type
Periodical
Source
IEEE Communications Letters, vol. 28, no. 2, pp. 332-336, Feb. 2024
Subject
Communication, Networking and Broadcast Technologies
Servers
Quantization (signal)
Computational modeling
Data models
Standards
Costs
Iterative methods
Federated learning
communication efficiency
nonuniform quantizer
Lloyd-Max algorithm
Language
ISSN
1089-7798
1558-2558
2373-7891
Abstract
Federated learning (FL), a promising machine learning framework for privacy preservation, has gained significant attention. However, the considerable communication overhead associated with FL remains a prominent challenge. To mitigate this issue, this letter introduces a nonuniform quantization scheme based on the Lloyd-Max algorithm. With this approach, fewer communication resources are consumed to achieve the same performance. Through performance analysis and numerical simulations, we verify the convergence and effectiveness of the proposed algorithm, demonstrating the potential of our approach to reduce communication overhead while maintaining reliable performance in FL systems.
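The abstract's core idea, designing a nonuniform scalar quantizer with the Lloyd-Max algorithm so that clients can transmit compressed model updates, can be illustrated with a minimal sketch. This is not the paper's implementation; it is a generic Lloyd-Max quantizer (equivalent to 1-D k-means) applied to a vector of model parameters, with the function name and defaults chosen for illustration.

```python
import numpy as np

def lloyd_max_quantize(x, n_levels=8, n_iters=50, tol=1e-8):
    """Nonuniform scalar quantization via the Lloyd-Max algorithm.

    Alternates two optimality conditions until convergence:
      1. decision boundaries are midpoints between adjacent levels;
      2. each level is the mean (centroid) of the samples in its cell.
    Returns the quantized vector, the per-sample level indices, and
    the learned codebook (indices alone are what a client would send).
    """
    x = np.asarray(x, dtype=float)
    # Initialize the codebook uniformly over the data range.
    levels = np.linspace(x.min(), x.max(), n_levels)
    for _ in range(n_iters):
        bounds = (levels[:-1] + levels[1:]) / 2.0
        idx = np.digitize(x, bounds)  # assign each sample to a cell
        new_levels = levels.copy()
        for k in range(n_levels):
            cell = x[idx == k]
            if cell.size:  # keep empty cells' levels unchanged
                new_levels[k] = cell.mean()
        if np.max(np.abs(new_levels - levels)) < tol:
            levels = new_levels
            break
        levels = new_levels
    bounds = (levels[:-1] + levels[1:]) / 2.0
    idx = np.digitize(x, bounds)
    return levels[idx], idx, levels
```

In an FL round, each client would quantize its update with such a codebook and transmit only the integer indices (log2 of the number of levels bits per parameter) plus the small codebook, rather than full-precision values, which is the source of the communication savings the letter targets.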