Academic Article

Inference Time Optimization Using BranchyNet Partitioning
Document Type
Conference
Source
2020 IEEE Symposium on Computers and Communications (ISCC), pp. 1-6, Jul. 2020
Subject
Communication, Networking and Broadcast Technologies
Computing and Processing
Fields, Waves and Electromagnetics
Photonics and Electrooptics
Robotics and Control Systems
Signal Processing and Analysis
Keywords
Shortest path problem
Sensitivity analysis
Delays
Topology
Partitioning algorithms
Servers
Proposals
Language
English
ISSN
2642-7389
Abstract
Deep Neural Network (DNN) inference requires high computational power, which generally calls for a cloud infrastructure. However, sending raw data to the cloud can increase the inference time due to communication delay. To reduce this delay, the first DNN layers can be executed at an edge infrastructure and the remaining ones in the cloud. Depending on which layers are processed at the edge, the amount of transmitted data can be greatly reduced. However, executing layers at the edge can increase the processing delay. The partitioning problem addresses this trade-off by choosing the set of layers to execute at the edge so as to minimize the inference time. In this work, we address the problem of partitioning a BranchyNet, a type of DNN in which inference can terminate at intermediate layers. We show that this partitioning can be treated as a shortest path problem and thus solved in polynomial time.
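The abstract does not detail the graph construction, but the shortest-path view of partitioning a linear layer chain can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the per-layer timings (edge_time, cloud_time, upload_time) are hypothetical placeholder values, and the BranchyNet-specific aspect (weighting layer costs by the early-exit probabilities of each side branch) is omitted. Each node represents a prefix of layers completed on one side; a "cut" arc carries the cost of uploading the intermediate tensor, and Dijkstra's algorithm recovers a minimum-time partition.

```python
import heapq

# Hypothetical per-layer costs in seconds; values are illustrative, not from the paper.
edge_time  = [0.030, 0.045, 0.060, 0.080]   # time to run layer i on the edge device
cloud_time = [0.004, 0.006, 0.008, 0.010]   # time to run layer i in the cloud
# upload_time[i]: delay to send the input of layer i (i == n means the final output)
upload_time = [0.200, 0.120, 0.050, 0.020, 0.015]

n = len(edge_time)

# Build a DAG: node ("edge", i) / ("cloud", i) means "layers 0..i-1 done on that side".
graph = {}
def add_arc(u, v, w):
    graph.setdefault(u, []).append((v, w))

for i in range(n):
    add_arc(("edge", i), ("edge", i + 1), edge_time[i])     # run layer i at the edge
    add_arc(("cloud", i), ("cloud", i + 1), cloud_time[i])  # run layer i in the cloud
for i in range(n + 1):
    add_arc(("edge", i), ("cloud", i), upload_time[i])      # cut here: ship data to cloud

def dijkstra(src, dst):
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the optimal path from dst back to src.
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return dist[dst], path[::-1]

total, path = dijkstra(("edge", 0), ("cloud", n))
cut = next(i for side, i in path if side == "cloud")  # first cloud node = partition point
print(f"optimal inference time: {total:.3f} s; run the first {cut} layer(s) at the edge")
```

With these placeholder numbers the optimal cut is after layer 1 (two layers at the edge), reflecting the trade-off the abstract describes: deeper layers shrink the tensor to upload but cost more edge compute. Since the graph has O(n) nodes and arcs, the shortest path is found in polynomial time, consistent with the paper's claim.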