no code implementations • 12 Dec 2022 • Sijia Xia, Duo Qiu, Xiongjun Zhang
The main advantage of the transformed tensor-tensor product is that its computational complexity is lower than that of existing methods based on the transformed tensor nuclear norm.
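As a rough illustration of where that cost advantage comes from, here is a minimal NumPy sketch of a transformed tensor-tensor product: transform along the third mode, multiply frontal slices pairwise, and transform back. The orthonormal DCT is only a stand-in for whatever unitary transform the paper actually uses, and the function name is hypothetical.

```python
import numpy as np
from scipy.fft import dct, idct  # orthonormal DCT as an example unitary transform

def transformed_tproduct(A, B):
    """Transformed tensor-tensor product of A (n1 x n2 x n3) and B (n2 x n4 x n3).

    Sketch only: the transform here is the orthonormal DCT along mode 3;
    the paper may use a different (possibly data-dependent) unitary transform.
    """
    # Move to the transform domain along the third mode.
    A_hat = dct(A, axis=2, norm="ortho")
    B_hat = dct(B, axis=2, norm="ortho")
    # Facewise (slice-by-slice) matrix products: n3 independent small products,
    # with no SVDs, which is where the cost advantage over nuclear-norm-based
    # approaches comes from.
    C_hat = np.einsum("ijk,jlk->ilk", A_hat, B_hat)
    # Return to the original domain.
    return idct(C_hat, axis=2, norm="ortho")

# Tiny usage example with random data.
A = np.random.rand(4, 5, 6)
B = np.random.rand(5, 3, 6)
C = transformed_tproduct(A, B)   # shape (4, 3, 6)
```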
no code implementations • 17 Aug 2022 • Xiongjun Zhang, Michael K. Ng
In this paper, we propose a sparse nonnegative Tucker decomposition and completion method for recovering the underlying nonnegative data from noisy observations.
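A minimal sketch of one way such a model could be optimized (not the authors' algorithm): a single projected-gradient pass that fits the observed entries, projects the factors onto the nonnegative orthant, and hard-thresholds the core as a crude surrogate for a sparsity constraint. Step size and threshold are placeholders.

```python
import numpy as np

def tucker3(G, U1, U2, U3):
    """Reconstruct a third-order tensor from a Tucker core G and factors U1, U2, U3."""
    return np.einsum("abc,ia,jb,kc->ijk", G, U1, U2, U3)

def sparse_ntd_step(X, mask, G, U1, U2, U3, step=1e-2, thresh=1e-3):
    """One projected-gradient step for sparse nonnegative Tucker completion (sketch).

    X    : observed tensor (values outside `mask` are ignored)
    mask : boolean tensor of observed entries
    """
    R = mask * (tucker3(G, U1, U2, U3) - X)          # residual on observed entries
    gG  = np.einsum("ijk,ia,jb,kc->abc", R, U1, U2, U3)
    gU1 = np.einsum("ijk,abc,jb,kc->ia", R, G, U2, U3)
    gU2 = np.einsum("ijk,abc,ia,kc->jb", R, G, U1, U3)
    gU3 = np.einsum("ijk,abc,ia,jb->kc", R, G, U1, U2)
    # Gradient step followed by projection onto the nonnegative orthant.
    U1 = np.maximum(U1 - step * gU1, 0.0)
    U2 = np.maximum(U2 - step * gU2, 0.0)
    U3 = np.maximum(U3 - step * gU3, 0.0)
    G  = np.maximum(G - step * gG, 0.0)
    G[G < thresh] = 0.0                              # keep the core sparse
    return G, U1, U2, U3
```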
no code implementations • 16 Dec 2020 • Guang-Jing Song, Michael K. Ng, Xiongjun Zhang
The main aim of this paper is to study $n_1 \times n_2 \times n_3$ third-order tensor completion based on the transformed tensor singular value decomposition, and to provide a bound on the number of sample entries required.
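For context, the completion model that this kind of sampling analysis is usually stated for (our notation, not necessarily the paper's) is

$$\min_{\mathcal{Z} \in \mathbb{R}^{n_1 \times n_2 \times n_3}} \|\mathcal{Z}\|_{\mathrm{TTNN}} \quad \text{s.t.} \quad \mathcal{P}_{\Omega}(\mathcal{Z}) = \mathcal{P}_{\Omega}(\mathcal{X}),$$

where $\mathcal{X}$ is the underlying tensor, $\Omega$ the set of observed entries, $\mathcal{P}_{\Omega}$ the projection onto those entries, and $\|\cdot\|_{\mathrm{TTNN}}$ a tensor nuclear norm induced by the transformed tensor SVD; the bound then controls how large $|\Omega|$ must be for recovery.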
no code implementations • 2 Dec 2020 • Zhebin Wu, Tianchi Liao, Chuan Chen, Cong Liu, Zibin Zheng, Xiongjun Zhang
In contrast, in the field of signal processing, Convolutional Sparse Coding (CSC) provides a good representation of the high-frequency component of an image, which is generally associated with the detail component of the data.
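To give a rough feel for how CSC captures that detail component, here is a minimal ISTA-style sketch in NumPy/SciPy (not the formulation of this paper); the filters, step size, and threshold level are placeholders.

```python
import numpy as np
from scipy.signal import fftconvolve

def soft(x, t):
    """Soft-thresholding operator for the l1 penalty on coefficient maps."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def csc_ista(x, filters, lam=0.05, step=0.1, iters=50):
    """Convolutional sparse coding of x with fixed filters (sketch).

    Approximately solves min_z 0.5*||sum_k d_k * z_k - x||^2 + lam * sum_k ||z_k||_1
    by ISTA; the sparse maps z_k reproduce mainly the high-frequency detail of x.
    """
    Z = [np.zeros_like(x) for _ in filters]
    for _ in range(iters):
        recon = sum(fftconvolve(z, d, mode="same") for z, d in zip(Z, filters))
        resid = recon - x
        for k, d in enumerate(filters):
            # Gradient of the data term w.r.t. z_k is a correlation with d_k.
            grad = fftconvolve(resid, d[::-1, ::-1], mode="same")
            Z[k] = soft(Z[k] - step * grad, step * lam)
    return Z

# Toy usage: two small edge-like (high-pass) filters on a random image.
x = np.random.rand(64, 64)
filters = [np.array([[1.0, 0.0, -1.0]]), np.array([[1.0], [0.0], [-1.0]])]
maps = csc_ista(x, filters)
```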
no code implementations • 21 Jul 2020 • Xiongjun Zhang, Michael K. Ng
We propose to minimize the sum of a maximum likelihood data-fitting term for the observations, subject to nonnegativity constraints, and the tensor $\ell_0$ norm of the sparse factor.
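Spelled out in our notation (an interpretation of the abstract, not a quote of the paper), with $\mathcal{Y}$ the noisy observation, $\mathcal{C}$ the core tensor playing the role of the sparse factor, and $U_1, U_2, U_3$ the nonnegative factor matrices, the objective has the form

$$\min_{\mathcal{C} \ge 0,\; U_1, U_2, U_3 \ge 0} \; -\log p\!\left(\mathcal{Y} \,\middle|\, \mathcal{C} \times_1 U_1 \times_2 U_2 \times_3 U_3\right) \;+\; \lambda\,\|\mathcal{C}\|_0,$$

where the first term is the negative log-likelihood (so maximizing the likelihood becomes a minimization) and $\lambda > 0$ trades off data fit against sparsity of the core.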
no code implementations • 21 Oct 2019 • Junjun Pan, Michael K. Ng, Ye Liu, Xiongjun Zhang, Hong Yan
In this paper, we study nonnegative tensor data and propose an orthogonal nonnegative Tucker decomposition (ONTD) method.
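In model form (written here for a third-order tensor; the paper may treat general order), ONTD seeks factor matrices that are simultaneously nonnegative and orthogonal:

$$\min_{\mathcal{G},\, U_1, U_2, U_3} \;\left\| \mathcal{X} - \mathcal{G} \times_1 U_1 \times_2 U_2 \times_3 U_3 \right\|_F^2 \quad \text{s.t.} \quad U_i \ge 0,\; U_i^{\top} U_i = I,\; i = 1, 2, 3.$$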
no code implementations • 2 Jul 2019 • Guangjing Song, Michael K. Ng, Xiongjun Zhang
In this paper, we study robust tensor completion using the transformed tensor singular value decomposition (SVD), which employs unitary transform matrices instead of the discrete Fourier transform matrix used in the traditional tensor SVD.
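A minimal sketch of such a transformed tensor SVD (NumPy; the orthonormal DCT stands in for the unitary transform, and all names are illustrative): transform along mode 3, take an ordinary SVD of every frontal slice in the transformed domain, and transform the factors back. Replacing the DCT with the FFT recovers, up to conventions, the classical DFT-based t-SVD.

```python
import numpy as np
from scipy.fft import dct, idct  # orthonormal DCT as an example unitary transform

def transformed_tsvd(X):
    """Transformed tensor SVD of an n1 x n2 x n3 tensor (sketch)."""
    n1, n2, n3 = X.shape
    r = min(n1, n2)
    X_hat = dct(X, axis=2, norm="ortho")      # to the transform domain
    U_hat = np.zeros((n1, n1, n3))
    S_hat = np.zeros((n1, n2, n3))
    V_hat = np.zeros((n2, n2, n3))
    for k in range(n3):
        # Ordinary matrix SVD of each frontal slice in the transformed domain.
        U, s, Vt = np.linalg.svd(X_hat[:, :, k])
        U_hat[:, :, k] = U
        S_hat[:r, :r, k] = np.diag(s)
        V_hat[:, :, k] = Vt.T
    # Bring the factor tensors back to the original domain.
    return (idct(U_hat, axis=2, norm="ortho"),
            idct(S_hat, axis=2, norm="ortho"),
            idct(V_hat, axis=2, norm="ortho"))
```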