Stability of Low-Rank Tensor Representations and Structured Multilevel Preconditioning for Elliptic PDEs

25 Feb 2018 · Markus Bachmayr, Vladimir Kazeev

Folding grid value vectors of size $2^L$ into $L$th order tensors of mode sizes $2\times \cdots\times 2$, combined with low-rank representation in the tensor train format, has been shown to lead to highly efficient approximations for various classes of functions. These include solutions of elliptic PDEs on nonsmooth domains or with oscillatory data. This tensor-structured approach is attractive because it leads to highly compressed, adaptive approximations based on simple discretizations. Straightforward choices of the underlying basis, such as piecewise multilinear finite elements on uniform tensor product grids, lead to the well-known matrix ill-conditioning of discrete operators. We demonstrate that for low-rank representations, the use of tensor structure itself additionally leads to representation ill-conditioning, a new effect specific to computations in tensor networks. We analyze the tensor structure of a BPX preconditioner for second-order linear elliptic operators and construct an explicit tensor-structured representation of the preconditioner, with ranks independent of the number $L$ of discretization levels. Straightforward multiplication leads to a decomposition of the preconditioned discrete operator which still suffers from representation ill-conditioning. By additionally eliminating redundancies, we obtain a reduced-rank decomposition that is free of both matrix and representation ill-conditioning. For an iterative solver based on soft thresholding of low-rank tensors, we obtain convergence and complexity estimates and demonstrate its reliability and efficiency for discretizations with up to $2^{50}$ nodes in each dimension.
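
The folding and low-rank compression described above can be made concrete with a short sketch. The following Python snippet is a minimal, self-contained illustration, not the authors' code: it folds a vector of length $2^L$ into an $L$th order tensor of mode sizes $2\times\cdots\times 2$ and compresses it with the standard TT-SVD sweep of truncated SVDs; the function names and the tolerance-based truncation rule are our own choices.

```python
import numpy as np

def tt_svd(v, L, tol=1e-12):
    """Fold a vector of length 2**L into an L-th order tensor of mode
    sizes 2 x ... x 2 and compress it in the tensor train (TT) format
    by a sweep of truncated SVDs. Core k has shape (r_{k-1}, 2, r_k)."""
    assert v.size == 2**L
    cores, rank = [], 1
    rest = v.reshape(rank * 2, -1)              # unfold along the first bit
    for _ in range(L - 1):
        U, s, Vt = np.linalg.svd(rest, full_matrices=False)
        r = max(1, int(np.sum(s > tol * s[0])))  # relative-tolerance truncation
        cores.append(U[:, :r].reshape(rank, 2, r))
        rank = r
        rest = (s[:r, None] * Vt[:r]).reshape(rank * 2, -1)
    cores.append(rest.reshape(rank, 2, 1))
    return cores

def tt_to_vector(cores):
    """Contract the TT cores back into the full vector of length 2**L."""
    w = cores[0].reshape(2, -1)
    for G in cores[1:]:
        r = G.shape[0]
        w = w.reshape(-1, r) @ G.reshape(r, -1)  # attach one more binary mode
    return w.reshape(-1)

# Samples of a smooth function on a dyadic grid compress to small ranks.
L = 20
x = np.linspace(0.0, 1.0, 2**L, endpoint=False)
v = np.exp(np.sin(2 * np.pi * x))
cores = tt_svd(v, L, tol=1e-10)
print("TT ranks:", [G.shape[2] for G in cores[:-1]])
print("max reconstruction error:", np.max(np.abs(tt_to_vector(cores) - v)))
```

The point of the format is that the storage cost scales like $O(L r^2)$ for maximal rank $r$ instead of $2^L$ for the dense vector, which is why grids with $2^{50}$ nodes per dimension become tractable whenever the ranks stay moderate.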
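The soft-thresholding solver mentioned at the end operates on low-rank tensors. As a schematic matrix analogue (the paper applies the corresponding operation to TT representations, not to a single matrix), soft thresholding shrinks every singular value by a parameter $\mu$ and discards those that reach zero, which both damps the iteration error and keeps the ranks under control. The helper below is again only an illustrative sketch under these assumptions.

```python
def soft_threshold_svd(A, mu):
    """Matrix soft thresholding: shrink each singular value by mu and
    drop the ones that vanish, returning a low-rank approximation."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s = np.maximum(s - mu, 0.0)
    r = max(1, int(np.count_nonzero(s)))  # keep at least one term
    return (U[:, :r] * s[:r]) @ Vt[:r]
```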
