Academic Paper

A Performance Efficient Approach of Global Training in Federated Learning
Document Type
Conference
Source
2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), pp. 112-115, Feb. 2023
Subject
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Fields, Waves and Electromagnetics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Training
Privacy
Data privacy
Costs
Federated learning
Distributed databases
Learning (artificial intelligence)
heterogeneous networks
deep learning
Language
English
ISSN
2831-6983
Abstract
Federated learning is a novel approach to training a global model on a server by utilizing the personal data of end users while preserving data privacy. The users, called clients, perform local training on their own datasets and forward the trained local models to the server, where the local models are aggregated to update the global model. This process of global training is repeated for several rounds until convergence. In practice, clients' data is non-independent and identically distributed (Non-IID), so the updated local model of each client may differ from every other client's due to the heterogeneity among them. Consequently, the way these diversified local models are aggregated has a large impact on the performance of global training. This article proposes a performance-efficient aggregation approach for federated learning that accounts for the data heterogeneity among clients before aggregating the received local models. The proposed approach is compared with conventional federated learning methods and achieves improved performance.
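The conventional aggregation the abstract compares against is federated averaging (FedAvg), in which the server computes a weighted average of the clients' parameters, typically weighted by each client's local dataset size. The paper's own heterogeneity-aware weighting is not specified in this record, so the sketch below shows only the standard baseline; the function name and data layout are illustrative assumptions.

```python
import numpy as np

def fedavg_aggregate(local_models, client_sizes):
    """FedAvg baseline: size-weighted average of client parameters.

    local_models: one parameter list (of np.ndarray) per client
    client_sizes: number of local training samples per client
    (Both names are illustrative, not from the paper.)
    """
    total = sum(client_sizes)
    weights = [n / total for n in client_sizes]
    # Average each parameter tensor across clients, weighted by data size.
    num_params = len(local_models[0])
    global_model = []
    for p in range(num_params):
        agg = sum(w * client[p] for w, client in zip(weights, local_models))
        global_model.append(agg)
    return global_model
```

A heterogeneity-aware scheme, as proposed in the paper, would adjust these weights using some measure of each client's data distribution rather than dataset size alone.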