Search Results for author: Qibin Zhao

Found 35 papers, 12 papers with code

Fast Hypergraph Regularized Nonnegative Tensor Ring Factorization Based on Low-Rank Approximation

no code implementations • 6 Sep 2021 • Xinhai Zhao, Yuyuan Yu, Guoxu Zhou, Qibin Zhao, Weijun Sun

For high-dimensional data representation, nonnegative tensor ring (NTR) decomposition equipped with manifold learning has become a promising model for exploiting the multi-dimensional structure of tensor data and extracting its features.

CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network

no code implementations • ACL 2021 • Jiajia Tang, Kang Li, Xuanyu Jin, Andrzej Cichocki, Qibin Zhao, Wanzeng Kong

In this work, the coupled-translation fusion network (CTFN) is proposed to model bi-directional interplay via coupled learning, ensuring robustness with respect to missing modalities.

Multimodal Sentiment Analysis

On the Memory Mechanism of Tensor-Power Recurrent Models

1 code implementation • 2 Mar 2021 • Hejia Qiu, Chao Li, Ying Weng, Zhun Sun, Xingyu He, Qibin Zhao

The tensor-power (TP) recurrent model is a family of non-linear dynamical systems whose recurrence relation consists of a p-fold (a.k.a. degree-p) tensor product.
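As a rough illustration of such a p-fold recurrence (a minimal sketch with p = 2; the function name, the tanh nonlinearity, and this simplified update rule are my own assumptions, not the exact model from the paper):

```python
import numpy as np

def tp_step(h, A, b, p=2):
    # Degree-p tensor-power update: contract the weight tensor A
    # with p copies of the hidden state h (a p-fold tensor product).
    z = A
    for _ in range(p):
        z = np.tensordot(z, h, axes=([1], [0]))  # contract one mode per copy of h
    return np.tanh(z + b)

rng = np.random.default_rng(0)
d = 4
A = rng.standard_normal((d, d, d)) * 0.1   # order-(p+1) weight tensor for p = 2
b = np.zeros(d)
h = rng.standard_normal(d)
for _ in range(3):                          # unroll a few recurrence steps
    h = tp_step(h, A, b)
print(h.shape)  # (4,)
```

The degree p controls how strongly past states interact multiplicatively, which is what ties this model family to questions about memory.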

Learning from Incomplete Features by Simultaneous Training of Neural Networks and Sparse Coding

1 code implementation • 28 Nov 2020 • Cesar F. Caiafa, Ziyao Wang, Jordi Solé-Casals, Qibin Zhao

A new supervised learning method is developed to train a general classifier, such as logistic regression or a deep neural network, using only a subset of features per sample, while assuming sparse representations of data vectors on an unknown dictionary.

Imputation

Non-local Meets Global: An Iterative Paradigm for Hyperspectral Image Restoration

1 code implementation • 24 Oct 2020 • Wei He, Quanming Yao, Chao Li, Naoto Yokoya, Qibin Zhao, Hongyan Zhang, Liangpei Zhang

Non-local low-rank tensor approximation has been developed as a state-of-the-art method for hyperspectral image (HSI) restoration, which includes the tasks of denoising, compressed HSI reconstruction and inpainting.

Denoising • Image Restoration

Graph Regularized Nonnegative Tensor Ring Decomposition for Multiway Representation Learning

no code implementations • 12 Oct 2020 • Yuyuan Yu, Guoxu Zhou, Ning Zheng, Shengli Xie, Qibin Zhao

Tensor ring (TR) decomposition is a powerful tool for exploiting the low-rank nature of multiway data and has demonstrated great potential in a variety of important applications.

Representation Learning

H-OWAN: Multi-distorted Image Restoration with Tensor 1x1 Convolution

no code implementations • 29 Jan 2020 • Zihao Huang, Chao Li, Feng Duan, Qibin Zhao

It is a challenging task to restore images from their variants with combined distortions.

Image Restoration

Hyperspectral Super-Resolution via Coupled Tensor Ring Factorization

no code implementations • 6 Jan 2020 • Wei He, Yong Chen, Naoto Yokoya, Chao Li, Qibin Zhao

In this paper, we propose a new model, named coupled tensor ring factorization (CTRF), for hyperspectral super-resolution (HSR).

Super-Resolution

Manifold Modeling in Embedded Space: A Perspective for Interpreting Deep Image Prior

1 code implementation • 8 Aug 2019 • Tatsuya Yokota, Hidekata Hontani, Qibin Zhao, Andrzej Cichocki

The proposed approach divides the convolution into "delay-embedding" and "transformation (i.e., encoder-decoder)", and proposes a simple but essential image/tensor modeling method that is closely related to dynamical systems and self-similarity.

Denoising • Image Reconstruction +2

Learning Representations from Imperfect Time Series Data via Tensor Rank Regularization

no code implementations • ACL 2019 • Paul Pu Liang, Zhun Liu, Yao-Hung Hubert Tsai, Qibin Zhao, Ruslan Salakhutdinov, Louis-Philippe Morency

Our method is based on the observation that high-dimensional multimodal time series data often exhibit correlations across time and modalities, which lead to low-rank tensor representations.

Question Answering • Sentiment Analysis +2

Tensor Ring Nets Adapted Deep Multi-Task Learning

no code implementations • ICLR 2019 • Xinqi Chen, Ming Hou, Guoxu Zhou, Qibin Zhao

Deep multi-task learning (MTL) has recently seen success in alleviating data scarcity for a task by utilizing domain-specific knowledge from related tasks.

Multi-Task Learning

Tensor-Ring Nuclear Norm Minimization and Application for Visual Data Completion

no code implementations • 21 Mar 2019 • Jinshi Yu, Chao Li, Qibin Zhao, Guoxu Zhou

Tensor ring (TR) decomposition has been successfully used to obtain the state-of-the-art performance in the visual data completion problem.

Compression and Interpretability of Deep Neural Networks via Tucker Tensor Layer: From First Principles to Tensor Valued Back-Propagation

no code implementations • 14 Mar 2019 • Giuseppe G. Calvi, Ahmad Moniri, Mahmoud Mahfouz, Qibin Zhao, Danilo P. Mandic

This is achieved through a tensor valued approach, based on the proposed Tucker Tensor Layer (TTL), as an alternative to the dense weight-matrices of DNNs.

Non-local Meets Global: An Integrated Paradigm for Hyperspectral Denoising

2 code implementations • CVPR 2019 • Wei He, Quanming Yao, Chao Li, Naoto Yokoya, Qibin Zhao

This is done by first learning a low-dimensional projection and the related reduced image from the noisy HSI.

Denoising

Low-Rank Embedding of Kernels in Convolutional Neural Networks under Random Shuffling

no code implementations • 31 Oct 2018 • Chao Li, Zhun Sun, Jinshi Yu, Ming Hou, Qibin Zhao

We demonstrate this by compressing the convolutional layers via randomly-shuffled tensor decomposition (RsTD) for a standard classification task using CIFAR-10.

General Classification • Tensor Decomposition

Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion

1 code implementation • 7 Sep 2018 • Longhao Yuan, Chao Li, Danilo Mandic, Jianting Cao, Qibin Zhao

In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method which is robust to model selection.

Model Selection • Tensor Decomposition

Brain-Computer Interface with Corrupted EEG Data: A Tensor Completion Approach

no code implementations • 13 Jun 2018 • Jordi Solé-Casals, Cesar F. Caiafa, Qibin Zhao, Andrzej Cichocki

For the random missing channels case, we show that tensor completion algorithms help to reconstruct missing channels, significantly improving motor imagery classification accuracy, although not to the level achieved with clean data.

Classification • EEG +2

Rank Minimization on Tensor Ring: A New Paradigm in Scalable Tensor Decomposition and Completion

no code implementations • 22 May 2018 • Longhao Yuan, Chao Li, Danilo Mandic, Jianting Cao, Qibin Zhao

In low-rank tensor completion tasks, traditional methods suffer from high computational cost and high sensitivity to model complexity, owing to the multiple large-scale singular value decomposition (SVD) operations they require and their underlying rank selection problem.

Tensor Decomposition

Beyond Unfolding: Exact Recovery of Latent Convex Tensor Decomposition under Reshuffling

no code implementations • 22 May 2018 • Chao Li, Mohammad Emtiyaz Khan, Zhun Sun, Gang Niu, Bo Han, Shengli Xie, Qibin Zhao

Exact recovery of tensor decomposition (TD) methods is a desirable property in both unsupervised learning and scientific data analysis.

Image Steganography • Tensor Decomposition

High-dimension Tensor Completion via Gradient-based Optimization Under Tensor-train Format

1 code implementation • 5 Apr 2018 • Longhao Yuan, Qibin Zhao, Lihua Gui, Jianting Cao

We propose two TT-based algorithms: Tensor Train Weighted Optimization (TT-WOPT) and Tensor Train Stochastic Gradient Descent (TT-SGD) to optimize TT decomposition factors.
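As a hedged sketch of the tensor-train (TT) format that these algorithms optimize (the function name and toy sizes below are mine, not from the paper's released code):

```python
import numpy as np

def tt_to_full(cores):
    # Contract a list of TT cores G_k of shape (r_{k-1}, n_k, r_k),
    # with boundary ranks r_0 = r_d = 1, back into the full tensor.
    res = cores[0]                                  # (1, n_1, r_1)
    for G in cores[1:]:
        res = np.tensordot(res, G, axes=([-1], [0]))
    return res.squeeze(axis=(0, -1))                # drop the boundary rank modes

rng = np.random.default_rng(1)
shape, ranks = (3, 4, 5), (1, 2, 2, 1)
cores = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
         for k in range(3)]
X = tt_to_full(cores)
print(X.shape)  # (3, 4, 5)
```

Completion methods in this family fit the cores so that `tt_to_full(cores)` matches the observed entries, then read off the missing ones from the reconstruction.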

Generative Adversarial Positive-Unlabelled Learning

no code implementations • 21 Nov 2017 • Ming Hou, Brahim Chaib-Draa, Chao Li, Qibin Zhao

However, given limited positive (P) data, conventional PU models tend to suffer from overfitting when adapted to very flexible deep neural networks.

Tensorizing Generative Adversarial Nets

1 code implementation • 30 Oct 2017 • Xingwei Cao, Xuyang Zhao, Qibin Zhao

Generative Adversarial Network (GAN) and its variants exhibit state-of-the-art performance in the class of generative models.

Completion of High Order Tensor Data with Missing Entries via Tensor-train Decomposition

1 code implementation • 8 Sep 2017 • Longhao Yuan, Qibin Zhao, Jianting Cao

In this paper, we aim at the completion problem of high order tensor data with missing entries.

Tensor Ring Decomposition

no code implementations • 17 Jun 2016 • Qibin Zhao, Guoxu Zhou, Shengli Xie, Liqing Zhang, Andrzej Cichocki

In this paper, we introduce a fundamental tensor decomposition model that represents a high-dimensional tensor by circular multilinear products over a sequence of low-dimensional cores, which can be graphically interpreted as a cyclic interconnection of 3rd-order tensors, hence the name tensor ring (TR) decomposition.
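The circular product of cores can be sketched in a few lines of NumPy (a minimal illustration; the function name and toy sizes are my own, not from an official implementation):

```python
import numpy as np

def tr_to_full(cores):
    # Tensor ring: each core G_k has shape (r_k, n_k, r_{k+1}), with the
    # last rank wrapping around to the first (r_{d+1} = r_1). Entries are
    # X[i1, ..., id] = trace(G_1[:, i1, :] @ ... @ G_d[:, id, :]).
    res = cores[0]                                  # (r1, n1, r2)
    for G in cores[1:]:
        res = np.tensordot(res, G, axes=([-1], [0]))
    # res now has shape (r1, n1, ..., nd, r1); close the ring by tracing
    # over the first and last rank modes.
    return np.trace(res, axis1=0, axis2=-1)

rng = np.random.default_rng(2)
shape, r = (3, 4, 5), 2
cores = [rng.standard_normal((r, n, r)) for n in shape]
X = tr_to_full(cores)
print(X.shape)  # (3, 4, 5)
```

The trace over the wrap-around rank mode is what distinguishes TR from the tensor-train format, whose boundary ranks are fixed to 1.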

Tensor Decomposition • Tensor Networks

Linked Component Analysis from Matrices to High Order Tensors: Applications to Biomedical Data

no code implementations • 29 Aug 2015 • Guoxu Zhou, Qibin Zhao, Yu Zhang, Tülay Adalı, Shengli Xie, Andrzej Cichocki

With the increasing availability of various sensor technologies, we now have access to large amounts of multi-block (also called multi-set, multi-relational, or multi-view) data that need to be jointly analyzed to explore their latent connections.

Tensor Decomposition

Smooth PARAFAC Decomposition for Tensor Completion

no code implementations • 25 May 2015 • Tatsuya Yokota, Qibin Zhao, Andrzej Cichocki

The proposed method admits significant advantages, owing to the integration of smooth PARAFAC decomposition for incomplete tensors and the efficient selection of models in order to minimize the tensor rank.

Matrix Completion

Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion

1 code implementation • 10 May 2015 • Qibin Zhao, Liqing Zhang, Andrzej Cichocki

Tucker decomposition is a cornerstone of modern machine learning on tensorial data and has attracted considerable attention for multiway feature extraction, compressive sensing, and tensor completion.

Compressive Sensing • Dimensionality Reduction +1

Bayesian Robust Tensor Factorization for Incomplete Multiway Data

no code implementations • 9 Oct 2014 • Qibin Zhao, Guoxu Zhou, Liqing Zhang, Andrzej Cichocki, Shun-ichi Amari

We propose a generative model for robust tensor factorization in the presence of both missing data and outliers.

Model Selection • Variational Inference

Efficient Nonnegative Tucker Decompositions: Algorithms and Uniqueness

no code implementations • 17 Apr 2014 • Guoxu Zhou, Andrzej Cichocki, Qibin Zhao, Shengli Xie

Nonnegative Tucker decomposition (NTD) is a powerful tool for the extraction of nonnegative parts-based and physically meaningful latent components from high-dimensional tensor data while preserving the natural multilinear structure of data.

Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination

1 code implementation • 25 Jan 2014 • Qibin Zhao, Liqing Zhang, Andrzej Cichocki

CANDECOMP/PARAFAC (CP) tensor factorization of incomplete data is a powerful technique for tensor completion through explicitly capturing the multilinear latent factors.
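As a rough illustration of CP factorization with missing data (the function names, the masked objective, and the toy setup below are my own, not from the paper's released code):

```python
import numpy as np

def cp_to_full(factors):
    # Rebuild a 3rd-order tensor from CP factor matrices A_k (n_k x R):
    # X = sum_r  A[:, r] outer B[:, r] outer C[:, r]  (rank-1 components).
    A, B, C = factors
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def masked_loss(factors, Y, mask):
    # Completion-style objective: fit only the observed entries of Y.
    return 0.5 * np.linalg.norm(mask * (Y - cp_to_full(factors))) ** 2

rng = np.random.default_rng(3)
shape, R = (4, 5, 6), 3
factors = [rng.standard_normal((n, R)) for n in shape]
Y = cp_to_full(factors)            # ground-truth low-rank tensor
mask = rng.random(shape) < 0.7     # ~70% of entries observed
print(masked_loss(factors, Y, mask))  # 0.0 at the true factors
```

Fitting the factors by minimizing this masked objective and then reading the missing entries from `cp_to_full(factors)` is the basic mechanism behind CP-based completion; the Bayesian treatment in the paper additionally infers the rank R automatically.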

Bayesian Inference • Image Inpainting

Higher-Order Partial Least Squares (HOPLS): A Generalized Multi-Linear Regression Method

1 code implementation • 5 Jul 2012 • Qibin Zhao, Cesar F. Caiafa, Danilo P. Mandic, Zenas C. Chao, Yasuo Nagasaka, Naotaka Fujii, Liqing Zhang, Andrzej Cichocki

A new generalized multilinear regression model, termed Higher-Order Partial Least Squares (HOPLS), is introduced with the aim of predicting a tensor (multiway array) $\mathcal{Y}$ from a tensor $\mathcal{X}$ by projecting the data onto a latent space and performing regression on the corresponding latent variables.
