
Deep Reinforcement Learning Based Algorithm for Symbiotic Radio IoT Throughput Optimization in 6G Network
Samar Shaker Metwaly, Helwan University Faculty of Engineering (Helwan)
Corresponding Author: samarshaker2904@h-eng.helwan.edu.eg
Gerges M. Salama, Minia University Faculty of Engineering
E. G. Shehata, Minia University Faculty of Engineering
Ahmed M. Abd El-Haleem, Helwan University Faculty of Engineering (Helwan)

Abstract

Internet of Things (IoT) based 6G is expected to revolutionize our world. Various candidate technologies have been proposed to meet the requirements of 6G-based IoT systems, and symbiotic radio (SR) is one of them. This paper uses symbiotic radio technology to support the passive Internet of Things and to enhance uplink transmission performance. In SR, the IoT tag is parasitic on the primary transmission: the tag's transmission shares not only the radio spectrum of the primary transmission but also the power and infrastructure of the neighboring smartphone primary system, which enhances the spectrum and energy efficiency of the system. The IoT tags' information is then sent to the cloud for analysis through the macro base station (MBS) or the wireless access point (WAP), with the smartphones acting as relays that forward this information to the MBS or WAP. In this paper, two optimization problems are formulated to maximize the total throughput of the system. The first is achieving the optimum mode selection of the LTE or Wi-Fi network (MBS or WAP) for transmitting the expected tag information load from the smartphone to the MBS or WAP, aiming to maximize the system throughput; a matching game algorithm is used to solve this problem. The second is achieving optimum clustering of tags, where the tags are divided into virtual clusters and the task is to find which smartphone's LTE/Wi-Fi downlink signal all cluster members can ride on to maximize the system throughput. A double deep Q-network (DDQN) model is proposed to solve this optimization problem with low complexity. Simulation results show that the proposed algorithms increase the total system data rate by an average of 90% over the system that uses the LTE network first and by 20% over the system without the DDQN algorithm. Furthermore, the proposed algorithms enhance the capacity of the system by an average of 100% over the system that uses the LTE network first without the DDQN algorithm.
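To make the second optimization step concrete, the sketch below illustrates a double deep Q-network (DDQN) training loop that learns to assign virtual tag clusters to smartphone downlink signals so as to maximize a throughput reward. It is a minimal illustration only: the toy environment, the synthetic per-link rates, the state encoding, the reward, and the network sizes are assumptions made for this example and do not reproduce the paper's system model or channel parameters; PyTorch is assumed to be available.

import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim


class ToyClusterEnv:
    """Hypothetical stand-in environment: one episode assigns each virtual
    tag cluster, in turn, to one of the smartphone downlinks; the reward is
    a synthetic rate value, not the paper's channel model."""

    def __init__(self, n_clusters=4, n_phones=6, seed=0):
        gen = torch.Generator().manual_seed(seed)
        self.rates = torch.rand((n_clusters, n_phones), generator=gen)
        self.n_clusters, self.n_phones = n_clusters, n_phones

    def reset(self):
        self.idx = 0
        return self._state()

    def _state(self):
        cluster = torch.zeros(self.n_clusters)
        cluster[self.idx] = 1.0
        return torch.cat([cluster, self.rates[self.idx]])

    def step(self, action):
        reward = float(self.rates[self.idx, action])
        self.idx += 1
        done = self.idx == self.n_clusters
        nxt = torch.zeros(self.n_clusters + self.n_phones) if done else self._state()
        return nxt, reward, done


def make_q_net(in_dim, n_actions):
    return nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))


env = ToyClusterEnv()
dim = env.n_clusters + env.n_phones
online, target = make_q_net(dim, env.n_phones), make_q_net(dim, env.n_phones)
target.load_state_dict(online.state_dict())
optimizer = optim.Adam(online.parameters(), lr=1e-3)
replay, gamma, eps = deque(maxlen=5000), 0.9, 0.2

for episode in range(300):
    state, done = env.reset(), False
    while not done:
        # epsilon-greedy action on the online network
        action = random.randrange(env.n_phones) if random.random() < eps \
            else int(online(state).argmax())
        nxt, reward, done = env.step(action)
        replay.append((state, action, reward, nxt, done))
        state = nxt
        if len(replay) < 64:
            continue
        batch = random.sample(replay, 64)
        s = torch.stack([b[0] for b in batch])
        a = torch.tensor([b[1] for b in batch])
        r = torch.tensor([b[2] for b in batch])
        s2 = torch.stack([b[3] for b in batch])
        d = torch.tensor([float(b[4]) for b in batch])
        # DDQN target: the online net picks the next action, the target net scores it
        with torch.no_grad():
            best = online(s2).argmax(dim=1, keepdim=True)
            y = r + gamma * (1.0 - d) * target(s2).gather(1, best).squeeze(1)
        q = online(s).gather(1, a.unsqueeze(1)).squeeze(1)
        loss = nn.functional.mse_loss(q, y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    # periodic hard update of the target network
    if episode % 20 == 0:
        target.load_state_dict(online.state_dict())

After training, a greedy pass over the online network yields one downlink choice per cluster; in the paper the state, action space, and reward would instead come from the SR system model and its throughput expressions.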
06 Dec 2022  Submitted to Transactions on Emerging Telecommunications Technologies
06 Dec 2022  Submission Checks Completed
06 Dec 2022  Assigned to Editor
06 Dec 2022  Review(s) Completed, Editorial Evaluation Pending
25 Dec 2022  Reviewer(s) Assigned
02 Mar 2023  Editorial Decision: Revise Major