Academic Paper

Heuristic-Based Proactive Service Migration Induced by Dynamic Computation Load in Edge Computing
Document Type
Conference
Source
GLOBECOM 2022 - 2022 IEEE Global Communications Conference, pp. 5668-5673, Dec. 2022
Subject
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Engineering Profession
General Topics for Engineers
Power, Energy and Industry Applications
Signal Processing and Analysis
Performance evaluation
Heuristic algorithms
Performance gain
Internet of Things
Time factors
Reliability
Global communication
Edge computing
migration decision latency
MTHG heuristic algorithm
proactive migration
Language
English
Abstract
Edge Computing (EC) has paved the way toward the realization of the Internet of Things (IoT). This can be attributed to the ability of EC to bring computational resources into close proximity to end-users, which significantly improves response time. However, the performance gain in EC can be compromised by service interruptions triggered by various dynamic changes, making reliable service migration crucial. Most existing service migration schemes, however, either fail to consider the profound impact of the dynamic computation load on service continuity or provide impractical, time-inefficient solutions based on optimization techniques. This paper proposes the Heuristic-based Load-induced Proactive Migration (HLPM) scheme. HLPM incorporates a Finite State Machine (FSM) to model the dynamic computation load and then makes proactive migration decisions based on the underlying transition probabilities. The proactive migration problem is solved using the MTHG heuristic algorithm. Performance evaluation shows that HLPM reduces migration decision latency by up to 97% compared to conventional optimization techniques. Furthermore, the performance gap between HLPM and the optimal migration solution is only 1.44% in latency and 3.89% in the number of migrations.
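
The abstract gives no implementation details, but the load-state modeling it describes can be illustrated with a minimal sketch. The Python snippet below (the state names, transition-probability matrix, prediction horizon, and decision threshold are all hypothetical, not taken from the paper) shows how an FSM over computation-load states with transition probabilities could drive a proactive migration trigger.

```python
import numpy as np

# Hypothetical computation-load states for the FSM (the paper's actual state
# definitions are not given in the abstract).
STATES = ["low", "medium", "high"]
OVERLOAD = STATES.index("high")

# Assumed transition-probability matrix: P[i, j] = Pr(next state j | current state i),
# e.g. estimated from observed load traces on an edge server.
P = np.array([
    [0.70, 0.20, 0.10],
    [0.30, 0.50, 0.20],
    [0.10, 0.30, 0.60],
])

def overload_probability(current_state: str, horizon: int = 3) -> float:
    """Probability of reaching the overload state within `horizon` transitions.

    The overload state is made absorbing so the result is a hitting
    probability rather than the marginal probability at step `horizon`.
    """
    Q = P.copy()
    Q[OVERLOAD, :] = 0.0
    Q[OVERLOAD, OVERLOAD] = 1.0
    dist = np.zeros(len(STATES))
    dist[STATES.index(current_state)] = 1.0
    for _ in range(horizon):
        dist = dist @ Q
    return float(dist[OVERLOAD])

def should_migrate_proactively(current_state: str, threshold: float = 0.5) -> bool:
    """Trigger a proactive migration decision when the predicted overload
    risk exceeds an (assumed) threshold."""
    return overload_probability(current_state) >= threshold

if __name__ == "__main__":
    for state in STATES:
        risk = overload_probability(state)
        print(f"state={state:<6} overload risk={risk:.3f} "
              f"migrate={should_migrate_proactively(state)}")
```

The sketch covers only the trigger logic; the subsequent placement of migrating services, which the paper formulates as a proactive migration problem and solves with the MTHG heuristic, is not shown here.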