Academic paper

Tackling Privacy Heterogeneity in Federated Learning
Document Type
Conference
Source
2023 21st International Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks (WiOpt), pp. 326-333, Aug. 2023
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Training
Privacy
Data privacy
Systematics
Federated learning
Wireless networks
Convex functions
Differential privacy
federated learning
client selection
privacy heterogeneity
Language
English
ISSN
2690-3342
Abstract
Differentially private federated learning enables clients with privacy concerns to collaboratively train a model while preserving their privacy. Clients' locally available data and maximum tolerable privacy budgets affect their contributions to training performance. To date, existing studies have focused on homogeneous privacy budgets, and systematic studies of the impact of clients' diverse privacy budgets (privacy heterogeneity) are lacking. This paper takes a first step toward filling this gap. Through rigorous convergence analysis, we show that the influence of privacy protection on the training loss depends on the client selection probabilities. In addition, client selection and privacy protection together induce a non-vanishing training error in federated learning. Our analysis further shows that this non-vanishing training error is a convex function of the client selection probabilities, which allows us to formulate privacy-aware client selection as a convex optimization problem. Numerical results demonstrate that the privacy-aware client selection strategy significantly improves learning performance. For example, compared with unbiased selection, it decreases the test loss by up to 63% on an MNIST convolutional neural network classifier.
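As a rough illustration of the kind of convex problem the abstract describes, the sketch below picks per-client selection probabilities by minimizing a hypothetical convex surrogate of the non-vanishing training error. The surrogate objective, its coefficients (built from made-up local dataset sizes and per-client privacy budgets), and the use of cvxpy are assumptions for illustration only; they are not the paper's actual derivation or formulation.

```python
# Minimal sketch (assumptions, not the paper's formulation): choose client
# selection probabilities q_i by minimizing a convex surrogate that combines
#   - a sampling-variance term  a_i / q_i  (rarely selected clients contribute noisier aggregates)
#   - a privacy-noise term      b_i * q_i  (selecting a client with a small budget eps_i adds more DP noise)
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n_clients = 20
expected_per_round = 5                                         # expected number of clients sampled per round

data_sizes = rng.integers(100, 1000, n_clients).astype(float)  # hypothetical local dataset sizes
eps = rng.uniform(0.5, 8.0, n_clients)                         # hypothetical per-client privacy budgets

a = data_sizes / data_sizes.sum()   # weight of each client's data in the surrogate
b = 1.0 / eps ** 2                  # DP-noise penalty grows as the budget shrinks

q = cp.Variable(n_clients)
objective = cp.Minimize(cp.sum(cp.multiply(a, cp.inv_pos(q)) + cp.multiply(b, q)))
constraints = [q >= 1e-3, q <= 1.0, cp.sum(q) == expected_per_round]
cp.Problem(objective, constraints).solve()

print("selection probabilities:", np.round(q.value, 3))
```

In this toy objective, clients with more data or looser privacy budgets end up with higher selection probabilities, which mirrors the intuition behind the privacy-aware client selection strategy described in the abstract.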