DRL-Based QoS-Aware Resource Allocation Scheme for Coexistence of Licensed and Unlicensed Users in LTE and Beyond

In this paper, we employ deep reinforcement learning to develop a novel radio resource allocation and packet scheduling scheme for differing Quality of Service (QoS) requirements, applicable to LTE-Advanced and 5G networks. In addition, given the scarcity of spectrum in the sub-6 GHz bands, the proposed algorithm dynamically allocates resource blocks (RBs) to licensed users in a way that largely preserves the contiguity of unallocated RBs. This improves the efficiency of communication among unlicensed entities by increasing the chance of uninterrupted transmission and reducing coordination overhead. The optimization problem is formulated as a Markov Decision Process (MDP) that observes the entire queue of demands, where failing to meet QoS constraints penalizes the objective with a multiplicative factor. Furthermore, a notion of contiguity for unallocated resources is incorporated as an additive term in the objective function. Considering the variations in both channel coefficients and user requests, we utilize a deep reinforcement learning algorithm as an online and numerically efficient approach to solving the MDP. Numerical results show that the proposed method achieves higher average spectral efficiency, while respecting the delay budget and packet loss ratio, compared to conventional greedy min-delay and max-throughput schemes in which a fixed part of the spectrum is forced to remain vacant for unlicensed entities.
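The abstract describes a reward that combines spectral efficiency, a multiplicative penalty when QoS constraints are violated, and an additive term rewarding contiguity of unallocated RBs. The snippet below is a minimal illustrative sketch of such a reward, not the paper's actual formulation; the function name, the penalty and weight values, and the choice of "longest contiguous run of free RBs" as the contiguity measure are all assumptions for illustration.

```python
def allocation_reward(spectral_efficiency, qos_met, unallocated_rb_mask,
                      qos_penalty=0.5, contiguity_weight=0.1):
    """Hypothetical reward sketch: throughput term scaled down
    multiplicatively on QoS violation, plus an additive bonus for
    keeping unallocated resource blocks contiguous."""
    # Multiplicative QoS penalty: shrink the reward if delay budget
    # or packet loss constraints were not met in this step.
    base = spectral_efficiency * (1.0 if qos_met else qos_penalty)

    # Additive contiguity term: reward the longest contiguous run of
    # free RBs (one possible notion of "contiguity" of unallocated RBs).
    longest_free_run, current = 0, 0
    for rb_is_free in unallocated_rb_mask:
        current = current + 1 if rb_is_free else 0
        longest_free_run = max(longest_free_run, current)

    return base + contiguity_weight * longest_free_run


# Example: 10 RBs, the last 4 left free for unlicensed users.
r = allocation_reward(spectral_efficiency=3.2, qos_met=True,
                      unallocated_rb_mask=[0, 0, 0, 0, 0, 0, 1, 1, 1, 1])
```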
