Search Results for author: Qibin Zhao

Found 59 papers, 19 papers with code

Tensor Ring Decomposition

1 code implementation • 17 Jun 2016 • Qibin Zhao, Guoxu Zhou, Shengli Xie, Liqing Zhang, Andrzej Cichocki

In this paper, we introduce a fundamental tensor decomposition model that represents a high-dimensional tensor by circular multilinear products over a sequence of low-dimensional cores. The model can be graphically interpreted as a cyclic interconnection of 3rd-order tensors and is thus termed tensor ring (TR) decomposition.

Tensor Decomposition Tensor Networks
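As an illustrative sketch only (not the paper's implementation), the cyclic contraction described in the abstract can be written with NumPy's `einsum`: each core is a 3rd-order tensor of shape (r_k, n_k, r_{k+1}), with the last rank wrapping around to the first. All names and sizes below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
r, n = 2, 4                          # assumed TR-rank and mode size
G1 = rng.standard_normal((r, n, r))  # three 3rd-order cores forming a ring
G2 = rng.standard_normal((r, n, r))
G3 = rng.standard_normal((r, n, r))

# Cyclic contraction: X[i, j, k] = trace(G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :]),
# i.e. the shared rank index wraps around from the last core to the first.
X = np.einsum('aib,bjc,cka->ijk', G1, G2, G3)
assert X.shape == (n, n, n)
```

With three modes the ring stores 3·r·n·r numbers instead of n³, which is where the compression of large tensors comes from.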

Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination

1 code implementation • 25 Jan 2014 • Qibin Zhao, Liqing Zhang, Andrzej Cichocki

CANDECOMP/PARAFAC (CP) tensor factorization of incomplete data is a powerful technique for tensor completion through explicitly capturing the multilinear latent factors.

Bayesian Inference Image Inpainting
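A minimal sketch of the CP model the abstract refers to (not the paper's Bayesian algorithm with automatic rank determination): a rank-R CP factorization writes a 3rd-order tensor as a sum of R rank-1 terms, and completion fits the factor matrices against the observed entries only. All names and sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 5, 4, 3, 2
A = rng.standard_normal((I, R))  # multilinear latent factors
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Completion view: the fit is evaluated only where entries are observed
data = X + 0.01 * rng.standard_normal(X.shape)  # stand-in observations
mask = rng.random((I, J, K)) < 0.7              # True = observed entry
loss = np.sum((mask * (data - X)) ** 2)
assert X.shape == (I, J, K) and loss >= 0.0
```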

SPD domain-specific batch normalization to crack interpretable unsupervised domain adaptation in EEG

1 code implementation • 2 Jun 2022 • Reinmar J Kobler, Jun-Ichiro Hirayama, Qibin Zhao, Motoaki Kawanabe

To achieve this, we propose a new building block for geometric deep learning, which we denote SPD domain-specific momentum batch normalization (SPDDSMBN).

Brain Computer Interface EEG +3

Non-local Meets Global: An Iterative Paradigm for Hyperspectral Image Restoration

1 code implementation • 24 Oct 2020 • Wei He, Quanming Yao, Chao Li, Naoto Yokoya, Qibin Zhao, Hongyan Zhang, Liangpei Zhang

Non-local low-rank tensor approximation has been developed as a state-of-the-art method for hyperspectral image (HSI) restoration, which includes the tasks of denoising, compressed HSI reconstruction and inpainting.

Denoising Image Restoration

Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion

1 code implementation • 10 May 2015 • Qibin Zhao, Liqing Zhang, Andrzej Cichocki

Tucker decomposition is a cornerstone of modern machine learning on tensorial data, and it has attracted considerable attention for multiway feature extraction, compressive sensing, and tensor completion.

Compressive Sensing Dimensionality Reduction +1
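An illustrative Tucker reconstruction (not the paper's Bayesian sparse variant): a small core tensor is multiplied by a factor matrix along each mode, which is what makes Tucker useful for multiway dimension reduction. Sizes below are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
(I, J, K), (P, Q, S) = (6, 5, 4), (2, 2, 2)
G  = rng.standard_normal((P, Q, S))  # small core tensor
U1 = rng.standard_normal((I, P))     # mode-1 factor matrix
U2 = rng.standard_normal((J, Q))     # mode-2 factor matrix
U3 = rng.standard_normal((K, S))     # mode-3 factor matrix

# X[i, j, k] = sum_{p,q,s} G[p, q, s] * U1[i, p] * U2[j, q] * U3[k, s]
X = np.einsum('pqs,ip,jq,ks->ijk', G, U1, U2, U3)
assert X.shape == (I, J, K)
```

The multiway "features" are the columns of U1, U2, U3; sparsity priors on them (as in the paper) would shrink unnecessary core dimensions away.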

Tensorizing Generative Adversarial Nets

1 code implementation • 30 Oct 2017 • Xingwei Cao, Xuyang Zhao, Qibin Zhao

Generative Adversarial Network (GAN) and its variants exhibit state-of-the-art performance in the class of generative models.

Generative Adversarial Network

High-dimension Tensor Completion via Gradient-based Optimization Under Tensor-train Format

1 code implementation • 5 Apr 2018 • Longhao Yuan, Qibin Zhao, Lihua Gui, Jianting Cao

We propose two TT-based algorithms: Tensor Train Weighted Optimization (TT-WOPT) and Tensor Train Stochastic Gradient Descent (TT-SGD) to optimize TT decomposition factors.

Vocal Bursts Intensity Prediction
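A hedged sketch of the tensor-train (TT) format that the two algorithms optimize: cores of shape (r_{k-1}, n_k, r_k) with boundary ranks fixed to 1, and a squared-error objective restricted to observed entries (the "weighted" part of TT-WOPT). All names and sizes here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 4, 2
G1 = rng.standard_normal((1, n, r))  # boundary core, left rank 1
G2 = rng.standard_normal((r, n, r))
G3 = rng.standard_normal((r, n, 1))  # boundary core, right rank 1

# Contract the train: X[i, j, k] = G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :]
X = np.einsum('aib,bjc,ckd->ijk', G1, G2, G3)

# Weighted objective on observed entries only (0/1 weight tensor W)
T = X + 0.1 * rng.standard_normal(X.shape)    # stand-in "data" tensor
W = (rng.random(X.shape) < 0.5).astype(float)
loss = 0.5 * np.sum((W * (T - X)) ** 2)
assert X.shape == (n, n, n) and loss >= 0.0
```

TT-WOPT would descend on this loss with deterministic gradients over all observed entries, while TT-SGD would sample entries stochastically.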

Completion of High Order Tensor Data with Missing Entries via Tensor-train Decomposition

1 code implementation • 8 Sep 2017 • Longhao Yuan, Qibin Zhao, Jianting Cao

In this paper, we aim at the completion problem of high order tensor data with missing entries.

Higher-Order Partial Least Squares (HOPLS): A Generalized Multi-Linear Regression Method

1 code implementation • 5 Jul 2012 • Qibin Zhao, Cesar F. Caiafa, Danilo P. Mandic, Zenas C. Chao, Yasuo Nagasaka, Naotaka Fujii, Liqing Zhang, Andrzej Cichocki

A new generalized multilinear regression model, termed Higher-Order Partial Least Squares (HOPLS), is introduced with the aim of predicting a tensor (multiway array) $\mathcal{Y}$ from a tensor $\mathcal{X}$ by projecting the data onto a latent space and performing regression on the corresponding latent variables.

regression

Manifold Modeling in Embedded Space: A Perspective for Interpreting Deep Image Prior

1 code implementation • 8 Aug 2019 • Tatsuya Yokota, Hidekata Hontani, Qibin Zhao, Andrzej Cichocki

The proposed approach divides the convolution into "delay-embedding" and "transformation (i.e., encoder-decoder)" steps, and proposes a simple but essential image/tensor modeling method that is closely related to dynamical systems and self-similarity.

Denoising Image Reconstruction +2

CTFN: Hierarchical Learning for Multimodal Sentiment Analysis Using Coupled-Translation Fusion Network

1 code implementation • ACL 2021 • Jiajia Tang, Kang Li, Xuanyu Jin, Andrzej Cichocki, Qibin Zhao, Wanzeng Kong

In this work, the coupled-translation fusion network (CTFN) is proposed to model bi-directional interplay via coupled learning, ensuring robustness with respect to missing modalities.

Multimodal Sentiment Analysis Translation

Learning from Incomplete Features by Simultaneous Training of Neural Networks and Sparse Coding

1 code implementation • 28 Nov 2020 • Cesar F. Caiafa, Ziyao Wang, Jordi Solé-Casals, Qibin Zhao

A new supervised learning method is developed to train a general classifier, such as a logistic regression or a deep neural network, using only a subset of features per sample, while assuming sparse representations of data vectors on an unknown dictionary.

Imputation

Permutation Search of Tensor Network Structures via Local Sampling

1 code implementation • 14 Jun 2022 • Chao Li, Junhua Zeng, Zerui Tao, Qibin Zhao

Recent works have put much effort into tensor network structure search (TN-SS), which aims to select suitable tensor network (TN) structures (the TN-ranks, formats, and so on) for decomposition or learning tasks.

Toward Understanding Convolutional Neural Networks from Volterra Convolution Perspective

1 code implementation • 19 Oct 2021 • Tenghui Li, Guoxu Zhou, Yuning Qiu, Qibin Zhao

We attempt to understand convolutional neural networks by exploring the relationship between (deep) convolutional neural networks and Volterra convolutions.
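A hedged illustration of the Volterra view (kernels and names below are made up, not from the paper): the first-order Volterra term is an ordinary linear convolution, while the second-order term mixes pairs of past inputs through a kernel h2; this is the kind of structure the paper relates to convolutional networks.

```python
import numpy as np

def volterra_2nd_order(x, h1, h2):
    """Truncated Volterra series with first- and second-order kernels."""
    n, k = len(x), len(h1)
    y = np.zeros(n)
    for t in range(n):
        # first-order term: plain linear convolution
        for a in range(k):
            if t - a >= 0:
                y[t] += h1[a] * x[t - a]
        # second-order term: quadratic interactions of past inputs
        for a in range(k):
            for b in range(k):
                if t - a >= 0 and t - b >= 0:
                    y[t] += h2[a, b] * x[t - a] * x[t - b]
    return y

x = np.array([1.0, 2.0, 0.5])
h1 = np.array([1.0, -1.0])
h2 = np.zeros((2, 2))  # with h2 = 0 this reduces to ordinary convolution
y = volterra_2nd_order(x, h1, h2)
assert np.allclose(y, np.convolve(x, h1)[:3])
```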

On the Memory Mechanism of Tensor-Power Recurrent Models

1 code implementation • 2 Mar 2021 • Hejia Qiu, Chao Li, Ying Weng, Zhun Sun, Xingyu He, Qibin Zhao

The tensor-power (TP) recurrent model is a family of non-linear dynamical systems whose recurrence relation consists of a p-fold (a.k.a. degree-p) tensor product.

Alternating Local Enumeration (TnALE): Solving Tensor Network Structure Search with Fewer Evaluations

1 code implementation • 25 Apr 2023 • Chao Li, Junhua Zeng, Chunmei Li, Cesar Caiafa, Qibin Zhao

Tensor network (TN) is a powerful framework in machine learning, but selecting a good TN model, known as TN structure search (TN-SS), is a challenging and computationally intensive task.

Computational Efficiency

Brain-Computer Interface with Corrupted EEG Data: A Tensor Completion Approach

no code implementations • 13 Jun 2018 • Jordi Sole-Casals, Cesar F. Caiafa, Qibin Zhao, Andrzej Cichocki

For the random missing channels case, we show that tensor completion algorithms help to reconstruct missing channels and significantly improve motor-imagery classification accuracy, although not to the level achieved with clean data.

Brain Computer Interface Classification +4

Rank Minimization on Tensor Ring: A New Paradigm in Scalable Tensor Decomposition and Completion

no code implementations • 22 May 2018 • Longhao Yuan, Chao Li, Danilo Mandic, Jianting Cao, Qibin Zhao

In low-rank tensor completion tasks, traditional methods suffer from high computational cost and high sensitivity to model complexity, owing to the multiple underlying large-scale singular value decomposition (SVD) operations and the rank-selection problem.

Tensor Decomposition

Beyond Unfolding: Exact Recovery of Latent Convex Tensor Decomposition under Reshuffling

no code implementations • 22 May 2018 • Chao Li, Mohammad Emtiyaz Khan, Zhun Sun, Gang Niu, Bo Han, Shengli Xie, Qibin Zhao

Exact recovery of tensor decomposition (TD) methods is a desirable property in both unsupervised learning and scientific data analysis.

Image Steganography Tensor Decomposition

Generative Adversarial Positive-Unlabelled Learning

no code implementations • 21 Nov 2017 • Ming Hou, Brahim Chaib-Draa, Chao Li, Qibin Zhao

However, given limited P data, the conventional PU models tend to suffer from overfitting when adapted to very flexible deep neural networks.

Smooth PARAFAC Decomposition for Tensor Completion

no code implementations • 25 May 2015 • Tatsuya Yokota, Qibin Zhao, Andrzej Cichocki

The proposed method admits significant advantages, owing to the integration of smooth PARAFAC decomposition for incomplete tensors and the efficient selection of models in order to minimize the tensor rank.

Matrix Completion

Efficient Nonnegative Tucker Decompositions: Algorithms and Uniqueness

no code implementations • 17 Apr 2014 • Guoxu Zhou, Andrzej Cichocki, Qibin Zhao, Shengli Xie

Nonnegative Tucker decomposition (NTD) is a powerful tool for the extraction of nonnegative parts-based and physically meaningful latent components from high-dimensional tensor data while preserving the natural multilinear structure of data.

Linked Component Analysis from Matrices to High Order Tensors: Applications to Biomedical Data

no code implementations • 29 Aug 2015 • Guoxu Zhou, Qibin Zhao, Yu Zhang, Tülay Adalı, Shengli Xie, Andrzej Cichocki

With the increasing availability of various sensor technologies, we now have access to large amounts of multi-block (also called multi-set, multi-relational, or multi-view) data that need to be jointly analyzed to explore their latent connections.

Tensor Decomposition

Bayesian Robust Tensor Factorization for Incomplete Multiway Data

no code implementations • 9 Oct 2014 • Qibin Zhao, Guoxu Zhou, Liqing Zhang, Andrzej Cichocki, Shun-ichi Amari

We propose a generative model for robust tensor factorization in the presence of both missing data and outliers.

Model Selection Variational Inference

Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion

no code implementations • 7 Sep 2018 • Longhao Yuan, Chao Li, Danilo Mandic, Jianting Cao, Qibin Zhao

In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method which is robust to model selection.

Model Selection Tensor Decomposition

Low-Rank Embedding of Kernels in Convolutional Neural Networks under Random Shuffling

no code implementations • 31 Oct 2018 • Chao Li, Zhun Sun, Jinshi Yu, Ming Hou, Qibin Zhao

We demonstrate this by compressing the convolutional layers via randomly-shuffled tensor decomposition (RsTD) for a standard classification task using CIFAR-10.

General Classification Tensor Decomposition

Tensor Ring Nets Adapted Deep Multi-Task Learning

no code implementations • ICLR 2019 • Xinqi Chen, Ming Hou, Guoxu Zhou, Qibin Zhao

Recent deep multi-task learning (MTL) has seen success in alleviating the data scarcity of one task by utilizing domain-specific knowledge from related tasks.

Multi-Task Learning

Compression and Interpretability of Deep Neural Networks via Tucker Tensor Layer: From First Principles to Tensor Valued Back-Propagation

no code implementations • 14 Mar 2019 • Giuseppe G. Calvi, Ahmad Moniri, Mahmoud Mahfouz, Qibin Zhao, Danilo P. Mandic

This is achieved through a tensor valued approach, based on the proposed Tucker Tensor Layer (TTL), as an alternative to the dense weight-matrices of DNNs.

Tensor-Ring Nuclear Norm Minimization and Application for Visual Data Completion

no code implementations • 21 Mar 2019 • Jinshi Yu, Chao Li, Qibin Zhao, Guoxu Zhou

Tensor ring (TR) decomposition has been successfully used to obtain the state-of-the-art performance in the visual data completion problem.

Learning Representations from Imperfect Time Series Data via Tensor Rank Regularization

no code implementations • ACL 2019 • Paul Pu Liang, Zhun Liu, Yao-Hung Hubert Tsai, Qibin Zhao, Ruslan Salakhutdinov, Louis-Philippe Morency

Our method is based on the observation that high-dimensional multimodal time series data often exhibit correlations across time and modalities, which lead to low-rank tensor representations.

Question Answering Sentiment Analysis +4

Hyperspectral Super-Resolution via Coupled Tensor Ring Factorization

no code implementations • 6 Jan 2020 • Wei He, Yong Chen, Naoto Yokoya, Chao Li, Qibin Zhao

In this paper, we propose a new model, named coupled tensor ring factorization (CTRF), for HSR.

Super-Resolution

H-OWAN: Multi-distorted Image Restoration with Tensor 1x1 Convolution

no code implementations • 29 Jan 2020 • Zihao Huang, Chao Li, Feng Duan, Qibin Zhao

It is a challenging task to restore images from their variants with combined distortions.

Image Restoration

Graph Regularized Nonnegative Tensor Ring Decomposition for Multiway Representation Learning

no code implementations • 12 Oct 2020 • Yuyuan Yu, Guoxu Zhou, Ning Zheng, Shengli Xie, Qibin Zhao

Tensor ring (TR) decomposition is a powerful tool for exploiting the low-rank nature of multiway data and has demonstrated great potential in a variety of important applications.

Clustering Representation Learning

Fast Hypergraph Regularized Nonnegative Tensor Ring Factorization Based on Low-Rank Approximation

no code implementations • 6 Sep 2021 • Xinhai Zhao, Yuyuan Yu, Guoxu Zhou, Qibin Zhao, Weijun Sun

For high-dimensional data representation, nonnegative tensor ring (NTR) decomposition equipped with manifold learning has become a promising model for exploiting multi-dimensional structure and extracting features from tensor data.

Defending Graph Neural Networks via Tensor-Based Robust Graph Aggregation

no code implementations • 29 Sep 2021 • Jianfu Zhang, Yan Hong, Dawei Cheng, Liqing Zhang, Qibin Zhao

In this paper, we propose a tensor-based framework for GNNs to learn robust graphs from adversarial graphs by aggregating predefined robust graphs to enhance the robustness of GNNs via tensor approximation.

Manifold Modeling in Embedded Space: A Perspective for Interpreting "Deep Image Prior"

no code implementations • 25 Sep 2019 • Tatsuya Yokota, Hidekata Hontani, Qibin Zhao, Andrzej Cichocki

The proposed approach divides the convolution into "delay-embedding" and "transformation (i.e., encoder-decoder)" steps, and proposes a simple but essential image/tensor modeling method that is closely related to dynamical systems and self-similarity.

Denoising Image Reconstruction +2

Supervised learning with incomplete data via sparse representations

no code implementations • 25 Sep 2019 • Cesar F. Caiafa, Ziyao Wang, Jordi Solé-Casals, Qibin Zhao

This paper addresses the problem of training a classifier on incomplete data and its application to a complete or incomplete test dataset.

Imputation

Graph-Constrained Structure Search for Tensor Network Representation

no code implementations • NeurIPS 2021 • Chao Li, Junhua Zeng, Zerui Tao, Qibin Zhao

Recent works have addressed the structure search problem for tensor network (TN) representation, the aim of which is to select the optimal network for TN contraction to fit a tensor.

Multi-view Data Classification with a Label-driven Auto-weighted Strategy

no code implementations • 3 Jan 2022 • Yuyuan Yu, Guoxu Zhou, Haonan Huang, Shengli Xie, Qibin Zhao

However, existing strategies cannot take advantage of semi-supervised information and only distinguish the importance of views from a data-feature perspective, which is often influenced by low-quality views and thus leads to poor performance.

Multi-View Learning

Noisy Tensor Completion via Low-rank Tensor Ring

no code implementations • 14 Mar 2022 • Yuning Qiu, Guoxu Zhou, Qibin Zhao, Shengli Xie

Experimental results on both synthetic and real-world data demonstrate the effectiveness and efficiency of the proposed model in recovering noisy incomplete tensor data compared with state-of-the-art tensor completion models.

Tensor Decomposition

Latent Matrices for Tensor Network Decomposition and to Tensor Completion

no code implementations • 7 Oct 2022 • Peilin Yang, Weijun Sun, Qibin Zhao, Guoxu Zhou

The prevalent fully-connected tensor network (FCTN) has achieved excellent success in compressing data.

Tensor Decomposition

SVDinsTN: A Tensor Network Paradigm for Efficient Structure Search from Regularized Modeling Perspective

no code implementations • 24 May 2023 • Yu-Bang Zheng, Xi-Le Zhao, Junhua Zeng, Chao Li, Qibin Zhao, Heng-Chao Li, Ting-Zhu Huang

To address this issue, we propose a novel TN paradigm, named SVD-inspired TN decomposition (SVDinsTN), which allows us to efficiently solve the TN-SS problem from a regularized modeling perspective, eliminating the repeated structure evaluations.

Revisiting Generalized p-Laplacian Regularized Framelet GCNs: Convergence, Energy Dynamic and Training with Non-Linear Diffusion

no code implementations • 25 May 2023 • Dai Shi, Zhiqi Shao, Yi Guo, Qibin Zhao, Junbin Gao

We conduct a convergence analysis on pL-UFG, addressing the gap in the understanding of its asymptotic behaviors.

Semi-supervised multi-view concept decomposition

no code implementations • 3 Jul 2023 • Qi Jiang, Guoxu Zhou, Qibin Zhao

Concept Factorization (CF), as a novel paradigm of representation learning, has demonstrated superior performance in multi-view clustering tasks.

Clustering Representation Learning

Unifying over-smoothing and over-squashing in graph neural networks: A physics informed approach and beyond

no code implementations • 6 Sep 2023 • Zhiqi Shao, Dai Shi, Andi Han, Yi Guo, Qibin Zhao, Junbin Gao

To explore more flexible filtering conditions, we further generalize MHKG into a model termed G-MHKG and thoroughly show the roles of each element in controlling over-smoothing, over-squashing and expressive power.

EpilepsyLLM: Domain-Specific Large Language Model Fine-tuned with Epilepsy Medical Knowledge

no code implementations • 11 Jan 2024 • Xuyang Zhao, Qibin Zhao, Toshihisa Tanaka

Built on such powerful LLMs, models fine-tuned with domain-specific datasets possess more specialized knowledge and are thus more practical, as is the case for medical LLMs.

Language Modelling Large Language Model

Efficient Nonparametric Tensor Decomposition for Binary and Count Data

1 code implementation • 15 Jan 2024 • Zerui Tao, Toshihisa Tanaka, Qibin Zhao

Finally, to address the computational issue of GPs, we enhance the model by incorporating sparse orthogonal variational inference of inducing points, which offers a more effective covariance approximation within GPs and stochastic natural gradient updates for nonparametric models.

Tensor Decomposition Variational Inference

Discovering More Effective Tensor Network Structure Search Algorithms via Large Language Models (LLMs)

no code implementations • 4 Feb 2024 • Junhua Zeng, Guoxu Zhou, Chao Li, Zhun Sun, Qibin Zhao

Tensor network structure search (TN-SS), aiming at searching for suitable tensor network (TN) structures in representing high-dimensional problems, largely promotes the efficacy of TN in various machine learning applications.

Image Compression

Tensor Star Decomposition

no code implementations • 15 Mar 2024 • Wuyang Zhou, Yu-Bang Zheng, Qibin Zhao, Danilo Mandic

A novel tensor decomposition framework, termed Tensor Star (TS) decomposition, is proposed, representing a new type of tensor network decomposition based on tensor contractions.

Tensor Decomposition

Robust Diffusion Models for Adversarial Purification

no code implementations • 24 Mar 2024 • Guang Lin, Zerui Tao, Jianhai Zhang, Toshihisa Tanaka, Qibin Zhao

We propose a novel robust reverse process with adversarial guidance, which is independent of given pre-trained DMs and avoids retraining or fine-tuning the DMs.
