Academic Paper

AFSD: Adaptive Feature Space Distillation for Distributed Deep Learning
Document Type
Periodical
Source
IEEE Access, vol. 10, pp. 84569-84578, 2022
Subject
Convolutional neural networks
Feature extraction
Computational modeling
Deep learning
Adaptation models
Data models
Concurrent computing
Knowledge management
Distributed computing
Distributed deep learning
knowledge distillation
codistillation
Language
English
ISSN
2169-3536
Abstract
We propose a novel adaptive feature space distillation method (AFSD) to reduce communication overhead among distributed computers. The proposed method improves the codistillation process by supporting longer update intervals. AFSD performs knowledge distillation across the models infrequently, giving each model the flexibility to explore diverse variations during training. We perform knowledge distillation by sharing the feature space rather than the output only, and accordingly propose a new loss function for the codistillation technique in AFSD. Using the feature space leads to more efficient knowledge transfer between models at longer update intervals. With our method, the models can achieve the same accuracy as Allreduce and codistillation with fewer epochs.
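The abstract does not give the paper's exact loss function, but the idea it describes, augmenting each model's task loss with a term that pulls its feature space toward a peer's, can be sketched in a minimal numpy example. All names, the MSE choice for the feature term, and the `alpha` weighting below are illustrative assumptions, not the authors' definition.

```python
import numpy as np

def feature_codistillation_loss(feats, peer_feats, logits, labels, alpha=0.5):
    """Hypothetical sketch of a feature-space codistillation loss.

    Combines a standard task loss with an MSE term that matches the
    model's intermediate features to a peer model's features (the peer
    is treated as a fixed target, as in codistillation). The structure
    and weighting here are illustrative, not taken from the paper.
    """
    # Task loss: softmax cross-entropy on the model's own logits.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    task_loss = -log_probs[np.arange(len(labels)), labels].mean()

    # Distillation loss: align feature spaces instead of outputs only.
    feat_loss = np.mean((feats - peer_feats) ** 2)

    return task_loss + alpha * feat_loss
```

Because the feature term only needs peer features at each (infrequent) synchronization point, a scheme like this exchanges far less data than gradient Allreduce at every step, which is the communication saving the abstract targets.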