Tensor N-tubal rank and its convex relaxation for low-rank tensor recovery

3 Dec 2018 · Yu-Bang Zheng, Ting-Zhu Huang, Xi-Le Zhao, Tai-Xiang Jiang, Teng-Yu Ji, Tian-Hui Ma

As low-rank modeling has achieved great success in tensor recovery, many research efforts have been devoted to defining the tensor rank. Among them, the recently popular tensor tubal rank, defined via the tensor singular value decomposition (t-SVD), has obtained promising results. However, the t-SVD framework and the tensor tubal rank are applicable only to three-way tensors and lack the flexibility to handle different correlations along different modes. To tackle these two issues, we define a new tensor unfolding operator, named mode-$k_1k_2$ tensor unfolding, as the process of lexicographically stacking the mode-$k_1k_2$ slices of an $N$-way tensor into a three-way tensor; it is a three-way extension of the well-known mode-$k$ tensor matricization. Based on this operator, we define a novel tensor rank, the tensor $N$-tubal rank, as a vector whose elements are the tubal ranks of all mode-$k_1k_2$ unfolding tensors, to depict the correlations along different modes. To efficiently minimize the proposed $N$-tubal rank, we establish its convex relaxation: the weighted sum of tensor nuclear norms (WSTNN). We then apply WSTNN to low-rank tensor completion (LRTC) and tensor robust principal component analysis (TRPCA). The corresponding WSTNN-based LRTC and TRPCA models are proposed, and two efficient alternating direction method of multipliers (ADMM)-based algorithms are developed to solve them. Numerical experiments demonstrate that the proposed models significantly outperform the compared methods.
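To make the two central definitions concrete, below is a minimal NumPy sketch of the mode-$k_1k_2$ unfolding and the resulting $N$-tubal rank, following the descriptions in the abstract. The function names (`mode_k1k2_unfold`, `tubal_rank`, `n_tubal_rank`), the lexicographic slice ordering, and the rank tolerance are illustrative assumptions, not the paper's reference implementation; the paper's exact conventions (e.g., slice ordering and normalization in the t-SVD) may differ.

```python
import numpy as np

def mode_k1k2_unfold(X, k1, k2):
    """Stack the mode-(k1, k2) slices of an N-way tensor X into a 3-way tensor
    of size n_{k1} x n_{k2} x (prod of the remaining dimensions).
    The third-mode ordering here is the lexicographic order induced by reshape."""
    axes = [k1, k2] + [ax for ax in range(X.ndim) if ax not in (k1, k2)]
    Y = np.transpose(X, axes)
    return Y.reshape(X.shape[k1], X.shape[k2], -1)

def tubal_rank(Y, tol=1e-10):
    """Tubal rank of a 3-way tensor via the t-SVD: take the FFT along the third
    mode and return the maximum matrix rank over the Fourier-domain slices."""
    Yf = np.fft.fft(Y, axis=2)
    return max(np.linalg.matrix_rank(Yf[:, :, i], tol=tol) for i in range(Y.shape[2]))

def n_tubal_rank(X):
    """N-tubal rank: the vector of tubal ranks of all mode-(k1, k2) unfoldings."""
    N = X.ndim
    return [tubal_rank(mode_k1k2_unfold(X, k1, k2))
            for k1 in range(N) for k2 in range(k1 + 1, N)]

# Usage example (hypothetical data): a 4-way tensor built from two rank-1
# outer products has tubal rank at most 2 in every mode-(k1, k2) unfolding.
rng = np.random.default_rng(0)
X = sum(np.einsum('i,j,k,l->ijkl', *(rng.standard_normal(n) for n in (5, 6, 7, 8)))
        for _ in range(2))
print(n_tubal_rank(X))  # six entries, one per (k1, k2) pair
```

The convex surrogate WSTNN then replaces each tubal rank in this vector with the tensor nuclear norm of the corresponding mode-$k_1k_2$ unfolding and combines them with nonnegative weights, which is what the LRTC and TRPCA models minimize via ADMM.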
