Journal Article

Delay-Aware and Energy-Efficient Carrier Aggregation in 5G Using Double Deep Q-Networks
Document Type
Periodical
Source
IEEE Transactions on Communications, 70(10):6615-6629, Oct. 2022
Subject
Communication, Networking and Broadcast Technologies
Power demand
Resource management
Optimization
Quality of service
Throughput
Delays
5G mobile communication
5G
carrier aggregation
double deep Q-network
energy-efficiency
reinforcement learning
Language
English
ISSN
0090-6778 (print)
1558-0857 (electronic)
Abstract
This paper studies Carrier Aggregation (CA), one of the key technologies in 5G networks. In CA, Component Carriers (CCs) can be activated and deactivated depending on multiple factors, e.g., energy consumption and the Quality of Service (QoS) demands of users. We propose CC management strategies in which each User Equipment (UE) jointly minimizes its average delay and its power consumption, while accounting for the fact that CCs can be activated and deactivated only at certain times, as in real-world CA implementations. We first model the problem as a centralized multi-objective optimal CC management problem. Since a centralized approach would impose a large overhead on the system, we then develop a semi-distributed solution by modeling the problem as a stochastic game and propose a multi-agent Double Deep Q-Network (DDQN) based CC management algorithm to solve it. Finally, we compare the proposed approaches with single-CC activation and all-CC activation baseline schemes. Simulation results show that our proposed algorithms outperform the all-CC scheme in terms of UE power consumption while transmitting a number of bits with a delay close to that of the all-CC scheme. In particular, our DDQN-based algorithm decreases UE power consumption by about 20% relative to the all-CC scheme.
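For readers unfamiliar with the learning rule the abstract refers to, the following is a minimal Python/PyTorch sketch of a Double DQN update applied to a hypothetical per-UE CC activation task. The state layout, the action encoding (one CC activation bitmap per action), the reward weighting of delay and power, and all hyperparameters are illustrative assumptions, not the paper's implementation; in the paper each UE runs its own agent and CCs can only be (de)activated at certain times, which a full implementation would have to respect.

import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

N_CC = 4                       # assumed number of component carriers per UE (illustrative)
STATE_DIM = 2 * N_CC           # e.g., per-CC queue backlog and channel quality (assumption)
N_ACTIONS = 2 ** N_CC          # one action per CC activation bitmap (assumption)
GAMMA = 0.95                   # discount factor (illustrative value)


class QNet(nn.Module):
    """Small MLP mapping a UE-local state to Q-values over CC activation patterns."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)


online, target = QNet(), QNet()
target.load_state_dict(online.state_dict())
optimizer = optim.Adam(online.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)  # transitions: (state, action, reward, next_state, done)


def ddqn_update(batch_size=64):
    """One Double DQN step: the online net selects the greedy next action,
    the target net evaluates it, which curbs the Q-value overestimation of plain DQN."""
    if len(replay) < batch_size:
        return
    s, a, r, s2, d = zip(*random.sample(replay, batch_size))
    s = torch.tensor(s, dtype=torch.float32)
    a = torch.tensor(a, dtype=torch.int64)
    r = torch.tensor(r, dtype=torch.float32)   # e.g., r = -(w_d * delay + w_p * power); weights assumed
    s2 = torch.tensor(s2, dtype=torch.float32)
    d = torch.tensor(d, dtype=torch.float32)

    q = online(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        greedy = online(s2).argmax(dim=1, keepdim=True)   # action chosen by the online net
        q_next = target(s2).gather(1, greedy).squeeze(1)  # value taken from the target net
        y = r + GAMMA * (1.0 - d) * q_next
    loss = nn.functional.mse_loss(q, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Periodically copy weights: target.load_state_dict(online.state_dict())

In a multi-agent setting such as the one the abstract describes, each UE would keep its own replay buffer and networks and call ddqn_update independently, with the stochastic game coupling agents only through their rewards and observed states.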