Academic Paper

HyperFeel: An Efficient Federated Learning Framework Using Hyperdimensional Computing
Document Type
Conference
Source
2024 29th Asia and South Pacific Design Automation Conference (ASP-DAC), pp. 716-721, Jan. 2024
Subject
Components, Circuits, Devices and Systems
Training
Degradation
Adaptation models
Federated learning
Computational modeling
Neural networks
Data models
Language
English
ISSN
2153-697X
Abstract
Federated Learning (FL) aims to establish a shared model across decentralized clients under a privacy-preserving constraint. Each client learns an independent model on local data, and only the model's updates are communicated. However, since FL models typically employ computation-intensive neural networks, the major challenges in Federated Learning are (i) significant computation overhead for local training; (ii) massive communication overhead arising from the model updates; and (iii) notable performance degradation in non-IID scenarios. In this work, we propose HyperFeel, an efficient framework for federated learning based on Hyperdimensional Computing (HDC), which can significantly improve communication and storage efficiency over existing works with nearly no performance degradation. Unlike current solutions that employ neural networks as the learning models, HyperFeel introduces a simple yet effective computing paradigm that uses hyperdimensional vectors to encode and represent data. It performs concise and highly parallel operations for encryption, computation, and communication, taking advantage of the lightweight feature representation of hyperdimensional vectors. To further enhance HyperFeel's performance, we propose a two-fold optimization scheme that combines the characteristics of encoding and updating in hyperdimensional computing. On one hand, we design a personalized update strategy for client models based on HDC, which achieves better accuracy on non-IID data. On the other hand, we extend the framework from horizontal FL to vertical FL based on a shared encoding mechanism. Comprehensive experimental results demonstrate that our method consistently outperforms state-of-the-art FL models. HyperFeel achieves $26 \times $ storage reduction and up to $81 \times $ communication reduction over FedAvg, with minimal accuracy drops on FEMNIST and Synthetic.
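To make the abstract's paradigm concrete, the following is a minimal, illustrative sketch of HDC-based federated classification: each client encodes samples into high-dimensional bipolar vectors, bundles them into per-class prototype hypervectors, and shares only those prototypes, which the server aggregates by element-wise summation. This is not HyperFeel's actual encoder or update strategy; the dimensionality `D`, the random-projection `encode`, and all function names are assumptions for demonstration only.

```python
import numpy as np

# Hedged sketch of HD-computing federated learning (NOT the paper's method).
rng = np.random.default_rng(0)
D = 2000  # hypervector dimensionality (illustrative choice)

def encode(x, proj):
    # Random-projection encoding: sign of a random linear map yields
    # a bipolar hypervector representing the input feature vector.
    return np.sign(proj @ x)

def train_client(X, y, proj, n_classes):
    # Each client bundles (sums) encoded samples into per-class
    # prototype hypervectors -- the only state it must communicate.
    protos = np.zeros((n_classes, D))
    for xi, yi in zip(X, y):
        protos[yi] += encode(xi, proj)
    return protos

def aggregate(client_protos):
    # Server aggregation: element-wise sum over client prototypes
    # (a FedAvg-like step over hypervectors instead of NN weights).
    return np.sum(client_protos, axis=0)

def predict(x, protos, proj):
    hv = encode(x, proj)
    # Classify by cosine similarity against each class prototype.
    sims = protos @ hv / (np.linalg.norm(protos, axis=1) * np.linalg.norm(hv) + 1e-9)
    return int(np.argmax(sims))

# Toy demo: two well-separated Gaussian blobs split across two clients.
n_feat, n_classes = 16, 2
proj = rng.standard_normal((D, n_feat))

def make_data(n):
    X0 = rng.standard_normal((n, n_feat)) + 2.0
    X1 = rng.standard_normal((n, n_feat)) - 2.0
    return np.vstack([X0, X1]), np.array([0] * n + [1] * n)

clients = [make_data(50) for _ in range(2)]
global_protos = aggregate([train_client(X, y, proj, n_classes) for X, y in clients])

Xt, yt = make_data(20)
acc = np.mean([predict(x, global_protos, proj) == yi for x, yi in zip(Xt, yt)])
print(f"toy accuracy: {acc:.2f}")
```

Note the communication payload here is `n_classes * D` numbers per client regardless of dataset size, which is the intuition behind the storage and communication reductions the abstract reports.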