Academic Paper

Towards Scalable Resilient Federated Learning: A Fully Decentralised Approach
Document Type
Conference
Source
2023 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), pp. 621-627, Mar. 2023
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
General Topics for Engineers
Robotics and Control Systems
Signal Processing and Analysis
Keywords
Federated learning
Voting
Scalability
Computational modeling
Conferences
Computer architecture
Data models
decentralized learning
pervasive machine learning
edge AI
scalability
resilience
Language
English
ISSN
2766-8576
Abstract
Federated Learning (FL) collaboratively trains machine learning models on the data of local devices without moving the data itself: a central server aggregates the locally trained models, which brings privacy and performance benefits but also scalability and resilience challenges. In this paper we present FDFL, a new fully decentralized FL model and architecture that improves the scalability and resilience of standard FL with no loss of convergence speed. FDFL provides an aggregator-based model that enables scalability benefits and features an election process to tolerate node failures. Simulation results show that FDFL scales well with network size in terms of computing, memory, and communication compared to related FL approaches such as standard FL, FL with aggregators, and FL with election, while also showing good resilience to node failures.
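
To make the two mechanisms named in the abstract concrete, the sketch below shows a FedAvg-style weighted model average (the aggregation step of standard FL) and a toy lowest-ID election to replace a failed aggregator. This is a minimal illustration under assumed details, not the FDFL protocol from the paper: the function names, the dictionary model representation, and the election rule are all assumptions made for exposition.

# Minimal sketch (assumptions, not the paper's FDFL protocol): FedAvg-style
# weighted aggregation plus a toy deterministic election for aggregator
# failover among surviving nodes.
from typing import Dict, List

Model = Dict[str, float]  # toy stand-in for a model's parameter vector

def aggregate(updates: List[Model], sample_counts: List[int]) -> Model:
    # Weighted average of client models, as in standard FL aggregation:
    # each client's parameters are scaled by its share of the total samples.
    total = sum(sample_counts)
    agg: Model = {}
    for model, n in zip(updates, sample_counts):
        for name, value in model.items():
            agg[name] = agg.get(name, 0.0) + value * (n / total)
    return agg

def elect_aggregator(alive_node_ids: List[int]) -> int:
    # Toy election rule: the surviving node with the lowest ID takes over,
    # so training tolerates the failure of the current aggregator.
    return min(alive_node_ids)

# Usage: two clients holding 10 and 30 samples respectively.
updates = [{"w": 1.0}, {"w": 3.0}]
print(aggregate(updates, [10, 30]))   # {'w': 2.5}
print(elect_aggregator([4, 2, 7]))    # node 2 replaces a failed aggregator

Weighting by sample count mirrors the common FedAvg convention; the actual FDFL election procedure and aggregator hierarchy are described in the paper itself.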