Academic Paper

Evaluating the Performance of Federated Learning Across Different Training Sample Distributions
Document Type
Conference
Source
2024 18th International Conference on Ubiquitous Information Management and Communication (IMCOM), pp. 1-6, Jan. 2024
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Robotics and Control Systems
Signal Processing and Analysis
Training
Federated learning
Neural networks
Organizations
Information management
Resource management
Image classification
sample distribution
deep neural networks
Language
English
Abstract
This study investigates how the distribution of training samples affects the performance of federated learning. By simulating datasets that are independent and identically distributed (IID) and non-independent and identically distributed (non-IID), and by varying the number of collaborating units, we observe how differences in training sample distribution affect the effectiveness of federated learning. In particular, we examine the special non-IID case in which the classes assigned to different units do not intersect. Using deep learning methods with both pretrained and trained-from-scratch models, the study comprehensively analyzes the impact of the number of units and of the data distribution, evaluating the jointly trained models by Top-1 and Top-5 accuracy. Experimental results show that the initial weight setting of joint training is critical: random weights lead to unstable model performance, while weights initialized under the same criteria yield stable and more accurate results. Model performance also varies with the characteristics of the data distribution: the federated model trained on IID samples performs best, followed by the imbalanced non-IID distribution, while the non-intersecting class allocation performs worst.
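The paper does not publish its data-partitioning code; as a minimal sketch of what simulating the three distribution regimes it compares could look like (the function name `partition`, the random skewing scheme for the imbalanced case, and the class-to-client assignment rule are all assumptions, not the authors' implementation):

```python
import random
from collections import defaultdict

def partition(labels, num_clients, mode="iid", seed=0):
    """Split sample indices among clients under three hypothetical schemes:
    'iid' (equal class shares), 'imbalanced' non-IID (skewed class shares),
    and 'disjoint' (non-intersecting classes per client)."""
    rng = random.Random(seed)
    idx_by_class = defaultdict(list)
    for i, y in enumerate(labels):
        idx_by_class[y].append(i)
    clients = [[] for _ in range(num_clients)]
    if mode == "iid":
        # Every client receives an equal, shuffled share of every class.
        for idxs in idx_by_class.values():
            rng.shuffle(idxs)
            for c in range(num_clients):
                clients[c].extend(idxs[c::num_clients])
    elif mode == "imbalanced":
        # All classes appear on all clients, but in randomly skewed proportions.
        for idxs in idx_by_class.values():
            rng.shuffle(idxs)
            weights = [rng.random() + 0.1 for _ in range(num_clients)]
            total, cut = sum(weights), 0
            for c in range(num_clients):
                n = int(len(idxs) * weights[c] / total)
                clients[c].extend(idxs[cut:cut + n])
                cut += n
            clients[-1].extend(idxs[cut:])  # assign any rounding remainder
    elif mode == "disjoint":
        # Each class lives on exactly one client: non-intersecting classes.
        for k, idxs in idx_by_class.items():
            clients[k % num_clients].extend(idxs)
    return clients
```

In such a setup, each client's index list would feed a local training loop, and a server would average the resulting model weights per round; starting every client from the same initial weights (rather than independent random ones) corresponds to the stable initialization the abstract reports.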