Journal Article

A Load Balancing Algorithm for Equalising Latency Across Fog or Edge Computing Nodes
Document Type
Periodical
Source
IEEE Transactions on Services Computing, 16(5):3129-3140, Jan. 2023
Subject
Computing and Processing
General Topics for Engineers
Load management
Mathematical models
Load modeling
Quality of service
Computational modeling
Edge computing
Task analysis
fog computing
load balancing
service latency
Language
English
ISSN
1939-1374
2372-0204
Abstract
When dealing with distributed applications in Edge or Fog computing environments, the service latency that the user experiences at a given node can be considered an indicator of how loaded that node is with respect to the others. Considering only the average CPU time or the RAM utilisation, for example, does not give a clear picture of the load situation, because these parameters are application- and hardware-agnostic: they give no information about how the application is performing from the user's perspective, and they cannot be used for QoS-oriented load balancing. In this article, we propose a load balancing algorithm that focuses on the service latency, with the objective of levelling it across all the nodes in a fully decentralised manner, so that no user experiences a worse QoS than the others. By providing a differential model of the system and an adaptive heuristic that finds a solution to the problem in real settings, we show, both in simulation and in a real-world deployment based on a cluster of Raspberry Pi boards, that our approach is able to level the service latency among a set of heterogeneous nodes organised in different topologies.
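The record contains no pseudocode, so the following is only a minimal, hypothetical Python sketch of the general idea the abstract describes: nodes in a fixed topology periodically compare their measured service latency with that of their neighbours and shift a small fraction of load toward lower-latency neighbours. The latency model, the damping factor alpha, and the example topology are assumptions made for illustration; they do not reproduce the authors' differential model or adaptive heuristic.

```python
# Hypothetical diffusion-style latency-levelling sketch (not the authors'
# algorithm). Each round, every node compares its latency with each
# neighbour's and moves load toward neighbours with lower latency,
# proportionally to the latency gap.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    capacity: float                 # abstract processing capacity (higher = faster)
    load: float                     # current offered load (e.g. requests/s)
    neighbours: list = field(default_factory=list)

    def latency(self) -> float:
        # Placeholder latency model: latency grows as load approaches capacity.
        utilisation = min(self.load / self.capacity, 0.99)
        return 1.0 / (self.capacity * (1.0 - utilisation))

def balance_step(nodes, alpha=0.2):
    """One decentralised balancing round: each node offers a fraction of its
    load to every lower-latency neighbour, scaled by the relative latency gap.
    Transfers are computed from the pre-round state and applied afterwards."""
    transfers = []  # (src, dst, amount)
    for node in nodes:
        for nb in node.neighbours:
            gap = node.latency() - nb.latency()
            if gap > 0:
                amount = alpha * node.load * gap / (node.latency() + nb.latency())
                transfers.append((node, nb, amount))
    for src, dst, amount in transfers:
        moved = min(amount, src.load)   # never move more load than the node has
        src.load -= moved
        dst.load += moved

if __name__ == "__main__":
    # Three heterogeneous nodes in a line topology: a - b - c.
    a = Node("a", capacity=10.0, load=9.0)
    b = Node("b", capacity=20.0, load=3.0)
    c = Node("c", capacity=15.0, load=6.0)
    a.neighbours, b.neighbours, c.neighbours = [b], [a, c], [b]
    nodes = [a, b, c]
    for _ in range(30):
        balance_step(nodes)
    # Per-node latencies should drift toward a common value.
    print([round(n.latency(), 3) for n in nodes])
```

In this toy model the update is purely local (each node only talks to its direct neighbours), which mirrors the fully decentralised character claimed in the abstract; the actual paper drives the balancing from measured service latency rather than from a synthetic load/capacity model.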