Recurrent Neural Network Assisted Transmitter Selection for Secrecy in Cognitive Radio Network

16 Feb 2021  ·  Shalini Tripathi, Chinmoy Kundu, Octavia A. Dobre, Ankur Bansal, Mark F. Flanagan

In this paper, we apply long short-term memory (LSTM), an advanced recurrent neural network based machine learning (ML) technique, to the problem of transmitter selection (TS) for secrecy in an underlay small-cell cognitive radio network with unreliable backhaul connections. In the cognitive communication scenario under consideration, a secondary small-cell network shares the spectrum of the primary network under an agreement to always maintain a desired outage probability constraint in the primary network. Because the interference from the primary transmitter is common to all secondary transmissions, the secrecy rates of the different transmitters are correlated. LSTM exploits this correlation and matches the performance of the conventional TS technique when the number of transmitters is small. As the number of transmitters grows, its performance degrades in the same manner as that of other ML techniques such as the support vector machine, $k$-nearest neighbors, naive Bayes, and the deep neural network. However, LSTM still significantly outperforms these techniques in terms of misclassification ratio and secrecy outage probability. It also reduces the feedback overhead compared with conventional TS.
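As a rough illustration of the kind of LSTM-based TS classifier the abstract describes, the sketch below (not the authors' implementation) treats the candidate transmitters' channel observations as a short sequence and trains an LSTM to predict the index of the secrecy-rate-maximizing transmitter. The feature layout, layer sizes, synthetic data, and training details are illustrative assumptions only.

```python
# Minimal sketch of an LSTM classifier for transmitter selection (TS).
# All architectural choices and the synthetic data are assumptions.
import torch
import torch.nn as nn

class TSSelector(nn.Module):
    def __init__(self, num_transmitters, feat_dim=2, hidden_dim=32):
        super().__init__()
        # Feed the K candidate transmitters as a length-K sequence so the
        # LSTM can exploit the correlation among their secrecy rates.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_transmitters)

    def forward(self, x):               # x: (batch, K, feat_dim)
        _, (h, _) = self.lstm(x)        # h: (1, batch, hidden_dim)
        return self.head(h.squeeze(0))  # logits over the K transmitters

K = 4
model = TSSelector(num_transmitters=K)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Placeholder data: features could be, e.g., each transmitter's destination
# and eavesdropper channel gains; labels are the indices of the best
# (secrecy-rate-maximizing) transmitters. Random here for illustration.
x = torch.randn(256, K, 2)
y = torch.randint(0, K, (256,))

for _ in range(10):                     # tiny training loop
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(model(x[:1]).argmax(dim=1))       # predicted transmitter index
```

In such a scheme, only the selected transmitter index (rather than full channel state feedback for every candidate) would need to be signaled, which is consistent with the reduced feedback overhead the abstract claims relative to conventional TS.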
