Academic Paper

Privacy-Preserving and Verifiable Decentralized Federated Learning
Document Type
Conference
Source
2023 5th International Conference on Energy, Power and Environment: Towards Flexible Green Energy Technologies (ICEPE), pp. 1-6, Jun. 2023
Subject
Components, Circuits, Devices and Systems
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Training
Privacy
Data privacy
Federated learning
Scalability
Collaboration
Data models
decentralized training
confidentiality
verifiability
Language
ISSN
2832-8973
Abstract
Federated learning is a machine learning technique that allows multiple devices to collaboratively train a model without sharing their data with a central server: the data remains on each device, and the model is trained locally. This reduces the risk of data breaches and enhances privacy, and it can improve the speed and efficiency of training large-scale models by distributing the workload across multiple devices. This paper addresses the security challenges faced by the federated learning framework, which enables global model construction without sharing raw data. Privacy-preserving and verifiable decentralized federated learning (PPVD-FL) is a decentralized framework for secure deep learning model training that preserves privacy and ensures verifiability. The framework uses an efficient and verifiable cipher-based matrix multiplication algorithm together with a suite of decentralized algorithms; it maintains the confidentiality of the global model and local updates and verifies every training step. Security analysis shows that PPVD-FL protects privacy against various inference attacks and ensures training integrity, and experiments on real-world datasets demonstrate its accuracy and practical performance.
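The core idea the abstract describes, training locally and sharing only model parameters, never raw data, can be sketched with plain federated averaging (FedAvg). This is a hedged toy illustration of the general federated learning workflow, not the paper's PPVD-FL protocol, which additionally encrypts updates and verifies each training step; the model (a one-weight linear regressor) and the client data are hypothetical.

```python
# Toy federated averaging: clients train on private data and share only
# weights with the aggregator. (Illustrative sketch only; PPVD-FL adds
# encrypted, verifiable aggregation not shown here.)

def local_update(w, data, lr=0.1):
    """One gradient-descent step for a 1-D linear model y = w * x,
    computed entirely on the client's private data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_average(client_weights):
    """Server-side aggregation: average the clients' model weights."""
    return sum(client_weights) / len(client_weights)

# Two clients, each holding private samples from the line y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0), (4.0, 8.0)]]
w_global = 0.0
for _ in range(50):  # communication rounds
    w_global = federated_average([local_update(w_global, d) for d in clients])
print(round(w_global, 2))  # converges to the true slope 2.0
```

Only the scalar weight crosses the network each round; the `(x, y)` samples never leave their client, which is the privacy property the paper's framework then hardens against inference attacks.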