Convolutional Polar Codes on Channels with Memory using Tensor Networks

Arikan’s recursive code construction is designed to polarize a collection of memoryless channels into a set of good and a set of bad channels, and it can be efficiently decoded using successive cancellation [1]. It was recently shown that the same construction also polarizes channels with memory [2], and a generalization of the successive cancellation decoder was proposed with a complexity that scales like the third power of the channel’s memory size [3]. In another line of work, the polar code construction was extended by replacing the block polarization kernel with a convolutional kernel [4]. Here, we present an efficient decoding algorithm for finite-state memory channels that can be applied to both polar codes and convolutional polar codes. This generalization is most effectively described using the tensor network formalism, and the manuscript presents a self-contained description of the required basic concepts. We use numerical simulations to study the performance of these algorithms for practically relevant code sizes and find that the convolutional structure outperforms standard polar codes on a variety of channels with memory.
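The tensor-network view of successive cancellation mentioned in the abstract can be illustrated with a minimal sketch. The code below (an assumption-laden illustration using NumPy, not the paper's actual construction) encodes the basic 2x2 polar kernel G2 = [[1,0],[1,1]] over GF(2) as a rank-4 tensor and contracts it with channel likelihood vectors to obtain the joint likelihood used in one elementary decoding step. The binary symmetric channel parameters and received bits are hypothetical.

```python
import numpy as np

# The 2x2 polar kernel G2 = [[1,0],[1,1]] over GF(2), viewed as a rank-4
# tensor T[u1, u2, x1, x2] that is 1 exactly when (x1, x2) = (u1 XOR u2, u2).
T = np.zeros((2, 2, 2, 2))
for u1 in range(2):
    for u2 in range(2):
        T[u1, u2, u1 ^ u2, u2] = 1.0

# Channel likelihood vectors P(y_i | x_i) for an illustrative binary
# symmetric channel with crossover probability p (hypothetical values).
p = 0.1
def bsc_likelihood(y):
    """Return [P(y | x=0), P(y | x=1)] for a BSC with crossover p."""
    return np.array([1 - p, p]) if y == 0 else np.array([p, 1 - p])

l1 = bsc_likelihood(0)  # assumed received bit y1 = 0
l2 = bsc_likelihood(1)  # assumed received bit y2 = 1

# Contracting the kernel tensor with the channel likelihoods yields the
# joint likelihood P(y1, y2 | u1, u2) -- a single tensor-network
# contraction step of the kind a successive cancellation decoder chains.
joint = np.einsum('abcd,c,d->ab', T, l1, l2)

# Marginalizing over u2 gives the likelihood vector used to decide u1 first,
# as in successive cancellation decoding.
u1_likelihood = joint.sum(axis=1)
```

Larger codes are obtained by recursively wiring copies of this kernel tensor into a network; decoding then amounts to contracting that network in a suitable order, which is where the tensor-network formalism pays off.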
