Residual Neural Networks for Digital Predistortion

12 May 2020  ·  Yibo Wu, Ulf Gustavsson, Alexandre Graell i Amat, Henk Wymeersch ·

Tracking the nonlinear behavior of an RF power amplifier (PA) is challenging. To tackle this problem, we build a connection between residual learning and PA nonlinearity, and propose a novel residual neural network structure, referred to as the residual real-valued time-delay neural network (R2TDNN). Instead of learning the whole behavior of the PA, the R2TDNN focuses on learning its nonlinear behavior by adding identity shortcut connections between the input and output layers. In particular, we apply the R2TDNN to digital predistortion and evaluate it experimentally on a real PA. Compared with neural networks recently proposed by Liu et al. and Wang et al., the R2TDNN achieves the best linearization performance in terms of normalized mean square error and adjacent channel power ratio, with similar or lower computational complexity. Furthermore, the R2TDNN exhibits significantly faster training and lower training error.
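The core idea above — an identity shortcut from the input to the output layer, so the hidden layers only have to model the PA's nonlinear deviation from a pass-through — can be sketched as a forward pass in NumPy. This is a minimal illustration, not the paper's implementation: the layer sizes, the `tanh` activation, and the I/Q input ordering are all assumptions.

```python
import numpy as np

def r2tdnn_forward(x, W1, b1, W2, b2):
    """Minimal sketch of one R2TDNN forward pass (architecture assumed).

    x : real-valued vector of length 2*(M+1), stacking I/Q samples with
        M delay taps, ordered [I(n), Q(n), I(n-1), Q(n-1), ...].
    The identity shortcut adds the current I/Q pair x[:2] to the 2-d
    network output, so the weights learn only the residual nonlinearity.
    """
    h = np.tanh(W1 @ x + b1)   # hidden layer (tanh activation assumed)
    y = W2 @ h + b2            # linear 2-d output: predistorted I and Q
    return y + x[:2]           # identity shortcut: input + learned residual

# Toy dimensions: M = 2 delay taps, H = 8 hidden units.
rng = np.random.default_rng(0)
M, H = 2, 8
x = rng.standard_normal(2 * (M + 1))
W1 = 0.1 * rng.standard_normal((H, 2 * (M + 1)))
b1 = np.zeros(H)
W2 = 0.1 * rng.standard_normal((2, H))
b2 = np.zeros(2)
y = r2tdnn_forward(x, W1, b1, W2, b2)   # 2-d predistorted I/Q sample
```

Note that with all weights set to zero the network reduces exactly to the identity map on the current I/Q pair, which is the point of the shortcut: training starts from "no predistortion" rather than from having to learn the PA's entire (mostly linear) behavior.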
