Deep Learning Based Load Balancing for improved QoS towards 6G

Deep learning has made great strides lately with the availability of powerful computing hardware and the advent of user-friendly programming environments. Deep learning algorithms are anticipated to underpin the majority of operations in 6G. One area where deep learning can be the right solution is load balancing in future intelligent 6G wireless networks. Load balancing is an efficient, cost-effective way to improve data-processing capability and throughput and to expand bandwidth, thereby enhancing the adaptability and availability of networks. Hence, a load balancing algorithm based on a Long Short-Term Memory (LSTM) deep neural network is proposed, through which the coverage area of a base station changes according to the geographic traffic distribution, catering to the requirements of future-generation 6G heterogeneous networks. The LSTM model's performance is evaluated under three different scenarios, and the results are presented. A load variance coefficient (LVC) and a load factor (LF) are introduced and validated over two wireless network layouts (WNLs) to study the Quality of Service (QoS) and load distribution. The proposed method decreases the LVC by 98.311% and 99.21% for WNL1 and WNL2, respectively.
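No code accompanies the paper, so the following is only a minimal sketch of the kind of model the abstract describes: an LSTM that forecasts per-cell traffic load from a recent window of measurements, which a load-balancing controller could then use to adapt base-station coverage. The layer sizes, window length, and synthetic data below are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the authors' code): LSTM forecasting next-step load per cell.
import torch
import torch.nn as nn

class LoadForecaster(nn.Module):
    def __init__(self, num_cells: int, hidden_size: int = 64):
        super().__init__()
        # Each time step carries one load value per base station (cell).
        self.lstm = nn.LSTM(input_size=num_cells, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, num_cells)

    def forward(self, x):
        # x: (batch, time_steps, num_cells) -- recent load history
        out, _ = self.lstm(x)
        # Predict the next-step load of every cell from the last hidden state.
        return self.head(out[:, -1, :])

if __name__ == "__main__":
    num_cells, window = 8, 12                 # assumed layout and history length
    model = LoadForecaster(num_cells)
    history = torch.rand(32, window, num_cells)  # synthetic load history
    next_load = model(history)                   # (32, num_cells) forecast
    print(next_load.shape)
```

A forecast like this could feed a simple balance metric, e.g. the coefficient of variation of predicted per-cell loads, which is one plausible reading of the LVC used in the paper; the exact definitions of LVC and LF are given in the paper itself.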
