RISnet: a Dedicated Scalable Neural Network Architecture for Optimization of Reconfigurable Intelligent Surfaces

6 Dec 2022 · Bile Peng, Finn Siegismund-Poschmann, Eduard A. Jorswieck

The reconfigurable intelligent surface (RIS) is a promising technology for next-generation wireless communication. It comprises many passive antennas that reflect signals from the transmitter to the receiver with adjustable phases but without changing the amplitude. The large number of antennas offers considerable signal-processing potential despite the simple functionality of each individual antenna. However, it also makes RIS configuration a high-dimensional problem that generally has no closed-form solution, and iterative numerical solutions are too complex for online real-time application. In this paper, we introduce a machine learning approach to maximize the weighted sum-rate (WSR) and propose a dedicated neural network architecture called RISnet. The architecture is designed according to the RIS properties of product and direct channels and of homogeneous RIS antennas. It is scalable because all antennas share the same trainable parameters, so the number of parameters is independent of the number of RIS antennas. Weighted minimum mean squared error (WMMSE) precoding is applied, and an alternating optimization (AO) training procedure is designed. Testing results show that the proposed approach outperforms the state-of-the-art block coordinate descent (BCD) algorithm. Moreover, although training takes several hours, inference with the trained model is almost instant, which makes it feasible for real-time application, whereas the BCD algorithm requires much longer to converge. Therefore, the proposed method outperforms the state-of-the-art algorithm in both performance and complexity.
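The scalability claim rests on sharing the same trainable parameters across all RIS antennas. The following is a minimal sketch of that idea under stated assumptions, not the authors' implementation: it assumes per-antenna channel features stacked along the antenna axis and a hypothetical SharedPerAntennaNet module that applies the same small network (realized as 1x1 convolutions) to every antenna, so the parameter count does not depend on the RIS size.

```python
# Sketch of parameter sharing across RIS antennas (illustrative, not the paper's code).
# Assumption: input is a tensor of per-antenna channel features with shape
# (batch, feature_dim, num_ris_antennas); output is one reflection phase per antenna.
import torch
import torch.nn as nn


class SharedPerAntennaNet(nn.Module):
    """Applies the same small network to every RIS antenna position."""

    def __init__(self, feature_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Conv1d with kernel_size=1 reuses identical weights at each antenna position,
        # so the number of trainable parameters is independent of the antenna count.
        self.net = nn.Sequential(
            nn.Conv1d(feature_dim, hidden_dim, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(hidden_dim, hidden_dim, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(hidden_dim, 1, kernel_size=1),  # one output per antenna
        )

    def forward(self, channel_features: torch.Tensor) -> torch.Tensor:
        # channel_features: (batch, feature_dim, num_ris_antennas)
        # Returns reflection phases in [0, 2*pi), shape (batch, num_ris_antennas).
        return 2 * torch.pi * torch.sigmoid(self.net(channel_features)).squeeze(1)


# The same trained module accepts RIS arrays of different sizes:
model = SharedPerAntennaNet(feature_dim=8)
phases_256 = model(torch.randn(4, 8, 256))    # 256-antenna RIS
phases_1024 = model(torch.randn(4, 8, 1024))  # 1024-antenna RIS, same parameters
```

Because identical weights are applied at every antenna position, a model trained once can in principle be evaluated on surfaces with different numbers of antennas, which is the scalability property emphasized in the abstract.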
