Academic Article

Gradient Coding With Dynamic Clustering for Straggler-Tolerant Distributed Learning
Document Type
Periodical
Source
IEEE Transactions on Communications, 71(6):3317-3332, June 2023
Subject
Communication, Networking and Broadcast Technologies
Encoding
Computational modeling
Redundancy
Servers
Codes
Machine learning
Task analysis
Distributed coded computation
gradient descent
straggler mitigation
gradient coding
clustering
Language
English
ISSN
0090-6778 (print)
1558-0857 (electronic)
Abstract
Distributed implementations are crucial in speeding up large-scale machine learning applications. Distributed gradient descent (GD) is widely employed to parallelize the learning task by distributing the dataset across multiple workers. A significant performance bottleneck for the per-iteration completion time in distributed synchronous GD is straggling workers. Coded distributed computation techniques have been introduced recently to mitigate stragglers and to speed up GD iterations by assigning redundant computations to workers. In this paper, we introduce a novel paradigm of dynamic coded computation, which assigns redundant data to workers so that the code can be chosen dynamically, at each iteration, from a set of possible codes depending on the past straggling behavior. In particular, we propose gradient coding (GC) with dynamic clustering, called GC-DC, which regulates the number of stragglers in each cluster by dynamically forming the clusters at each iteration. Under time-correlated straggling behavior, GC-DC adapts over time: at each iteration, it aims to distribute the stragglers across clusters as uniformly as possible based on their past behavior. For both homogeneous and heterogeneous worker models, we numerically show that GC-DC significantly improves the average per-iteration completion time without increasing the communication load relative to the original GC scheme.
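To make the clustering idea concrete, below is a minimal Python sketch of the kind of greedy re-clustering the abstract describes: workers flagged as recent stragglers are spread across clusters as evenly as possible before the next GD iteration. The function name `assign_clusters`, the one-step straggler history, and the greedy tie-breaking rule are illustrative assumptions, not the paper's exact algorithm or straggler model.

```python
def assign_clusters(workers, straggle_history, n_clusters, cluster_size):
    """Greedily spread recent stragglers uniformly across clusters.

    workers: list of worker ids (len == n_clusters * cluster_size assumed).
    straggle_history: dict worker_id -> True if the worker straggled in the
        previous iteration (a crude one-step proxy for the paper's
        time-correlated straggling behavior).
    Returns a list of n_clusters lists of worker ids.
    """
    # Place recent stragglers first (False sorts before True, so negate).
    order = sorted(workers, key=lambda w: not straggle_history.get(w, False))
    clusters = [[] for _ in range(n_clusters)]
    straggler_count = [0] * n_clusters
    for w in order:
        # Among clusters with a free slot, pick the one holding the fewest
        # stragglers so far (ties broken by cluster index).
        open_clusters = [c for c in range(n_clusters)
                         if len(clusters[c]) < cluster_size]
        c = min(open_clusters, key=lambda c: (straggler_count[c], c))
        clusters[c].append(w)
        if straggle_history.get(w, False):
            straggler_count[c] += 1
    return clusters


if __name__ == "__main__":
    workers = list(range(8))
    history = {1: True, 4: True, 5: True}  # workers that straggled last round
    # With 4 clusters of size 2, the three stragglers land in 3 different
    # clusters, so no cluster has to wait on more than one slow worker.
    print(assign_clusters(workers, history, n_clusters=4, cluster_size=2))
```

The intuition behind balancing stragglers per cluster is that in clustered GC each cluster can tolerate only a limited number of stragglers before its partial gradient is delayed; spreading likely stragglers uniformly, rather than letting them concentrate in one cluster, reduces the per-iteration completion time without adding communication.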