Tensor Decomposition

128 papers with code • 0 benchmarks • 0 datasets

Tensor decomposition expresses a multidimensional array as a combination of simpler factors, for example a small core tensor together with one factor matrix per mode, generalizing matrix factorization to higher-order data for compression, completion, and latent-structure discovery.

Libraries

Use these libraries to find Tensor Decomposition models and implementations

Most implemented papers

Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion

lijunsun/bgcp_imputation 10 May 2015

Tucker decomposition is a cornerstone of modern machine learning on tensorial data, and it has attracted considerable attention for multiway feature extraction, compressive sensing, and tensor completion.
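
As a rough illustration of the Tucker model itself (not this paper's Bayesian sparse variant), the sketch below computes a Tucker decomposition via plain truncated HOSVD in NumPy; the shapes, ranks, and function names are chosen for the example.

```python
import numpy as np

def unfold(T, mode):
    """Matricize T along the given mode (rows indexed by that mode)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_hosvd(T, ranks):
    """Truncated HOSVD: factor matrices from mode-wise SVDs, then the core."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        # Mode-m product with U^T projects mode m onto its leading subspace.
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, m)), 0, m)
    return core, factors

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 9, 10))
core, factors = tucker_hosvd(X, ranks=(4, 4, 4))

# Reconstruct from the core and factors and report the relative fit error.
X_hat = core
for m, U in enumerate(factors):
    X_hat = np.moveaxis(np.tensordot(U, X_hat, axes=(1, m)), 0, m)
print("relative error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```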

Tensor Ring Decomposition

zhaoxile/reproducible-tensor-completion-state-of-the-art 17 Jun 2016

In this paper, we introduce a fundamental tensor decomposition model that represents a large, high-dimensional tensor by circular multilinear products over a sequence of low-dimensional cores. This construction can be graphically interpreted as a cyclic interconnection of 3rd-order tensors, and is thus termed tensor ring (TR) decomposition.
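
The defining operation of the TR format is closing the chain of 3rd-order cores with a trace. Below is a minimal NumPy sketch of that reconstruction step, with illustrative ranks and shapes rather than the paper's code:

```python
import numpy as np

def tr_to_full(cores):
    # cores[k] has shape (r_k, n_k, r_{k+1}), with r_K == r_0 so the ring closes.
    T = cores[0]                              # (r_0, n_0, r_1)
    for G in cores[1:]:
        # Contract the trailing bond index with the next core's leading bond.
        T = np.tensordot(T, G, axes=(-1, 0))  # (r_0, n_0, ..., n_k, r_{k+1})
    # Close the ring: trace over the first and last bond indices.
    return np.trace(T, axis1=0, axis2=-1)

# Example: a 3rd-order tensor built from three TR cores with bond ranks (2, 3, 4).
rng = np.random.default_rng(0)
r = [2, 3, 4, 2]            # r[3] == r[0] closes the ring
n = [5, 6, 7]
cores = [rng.standard_normal((r[k], n[k], r[k + 1])) for k in range(3)]
X = tr_to_full(cores)
print(X.shape)              # (5, 6, 7)
```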

Shape Constrained Tensor Decompositions using Sparse Representations in Over-Complete Libraries

BethanyL/SCTD 16 Aug 2016

We consider $N$-way data arrays and low-rank tensor factorizations where the time mode is coded as a sparse linear combination of temporal elements from an over-complete library.
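
A hedged sketch of the core idea, re-expressing a temporal mode as a sparse combination of atoms from an over-complete library via Lasso; the library, signal, and parameter choices here are illustrative stand-ins, not the SCTD implementation:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
T = 200                                   # number of time points
t = np.linspace(0, 1, T)

# Over-complete temporal library: sines and cosines at many frequencies.
freqs = np.arange(1, 41)
library = np.column_stack([np.sin(2 * np.pi * f * t) for f in freqs] +
                          [np.cos(2 * np.pi * f * t) for f in freqs])

# A temporal mode (e.g., one column of the time-mode factor from a low-rank fit),
# simulated here as a few library atoms plus noise.
temporal_factor = (library[:, 3] - 0.5 * library[:, 47]
                   + 0.01 * rng.standard_normal(T))

# Sparse-code the temporal factor in the over-complete library.
lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
lasso.fit(library, temporal_factor)
active = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)
print("active atoms:", active)            # should concentrate on the true atoms
```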

Sublinear Time Orthogonal Tensor Decomposition

huanzhang12/sampling_tensor_decomp NeurIPS 2016

We show that in a number of cases one can achieve the same theoretical guarantees in sublinear time, i.e., even without reading most of the input tensor.
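
For reference, the sketch below runs the standard (full-tensor) power method with deflation on a synthetic orthogonally decomposable tensor; it does not reproduce the paper's sublinear sampling scheme, and all names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a symmetric 3rd-order tensor with an exact orthogonal decomposition
# T = sum_i lambda_i * v_i (x) v_i (x) v_i, with orthonormal v_i.
d, k = 10, 3
lam = np.array([5.0, 3.0, 1.5])
V, _ = np.linalg.qr(rng.standard_normal((d, k)))     # orthonormal columns
T = sum(lam[i] * np.einsum('i,j,k->ijk', V[:, i], V[:, i], V[:, i])
        for i in range(k))

def power_iteration(T, n_iter=100):
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, u, u)          # T(I, u, u)
        u /= np.linalg.norm(u)
    lam_hat = np.einsum('ijk,i,j,k->', T, u, u, u)    # T(u, u, u)
    return lam_hat, u

# Recover the components one at a time, deflating after each.
for _ in range(k):
    lam_hat, v = power_iteration(T)
    print(round(float(lam_hat), 3))
    T = T - lam_hat * np.einsum('i,j,k->ijk', v, v, v)
```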

Introduction to Tensor Decompositions and their Applications in Machine Learning

yidilozdemir/fmriBridge 29 Nov 2017

Tensors are multidimensional arrays of numerical values and therefore generalize matrices to multiple dimensions.
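
A tiny NumPy example of that statement: a 3rd-order tensor is an array with three index modes, and matricizing (unfolding) one mode turns it back into a matrix:

```python
import numpy as np

# A 3rd-order tensor: a 2 x 3 x 4 array of numbers; a matrix is the
# special case with only two modes.
X = np.arange(24).reshape(2, 3, 4)
print(X.ndim, X.shape)                      # 3 (2, 3, 4)

# Mode-2 unfolding: bring the second mode to the front and flatten the rest.
X_2 = np.moveaxis(X, 1, 0).reshape(3, -1)   # a 3 x 8 matrix
print(X_2.shape)
```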

Fitting Low-Rank Tensors in Constant Time

hayasick/CTFT NeurIPS 2017

Then, we show that the residual error of the Tucker decomposition of $\tilde{X}$ is sufficiently close to that of $X$ with high probability.
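
A simplified illustration of the idea, assuming uniform subsampling of indices along each mode to form $\tilde{X}$ and plain HOSVD as the Tucker solver; the paper's estimator and its guarantees are more specific than this sketch:

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def tucker_residual(T, ranks):
    """HOSVD-style Tucker fit; returns the relative residual error."""
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    core = T
    for m, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, m)), 0, m)
    T_hat = core
    for m, U in enumerate(factors):
        T_hat = np.moveaxis(np.tensordot(U, T_hat, axes=(1, m)), 0, m)
    return np.linalg.norm(T - T_hat) / np.linalg.norm(T)

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 60, 60))      # stand-in for a large tensor

# X_tilde: subsample a fixed number of indices along every mode.
s = 20
idx = [rng.choice(X.shape[m], size=s, replace=False) for m in range(3)]
X_tilde = X[np.ix_(*idx)]

ranks = (5, 5, 5)
print("residual on X      :", tucker_residual(X, ranks))
print("residual on X_tilde:", tucker_residual(X_tilde, ranks))
```

The point of the sketch is only that the residual of the small subsampled tensor is computed at a cost that depends on the sample size, not on the size of $X$.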

Legendre Decomposition for Tensors

mahito-sugiyama/Legendre-decomposition NeurIPS 2018

We present a novel nonnegative tensor decomposition method, called Legendre decomposition, which factorizes an input tensor into a multiplicative combination of parameters.

CoVeR: Learning Covariate-Specific Vector Representations with Tensor Decompositions

justinaL/tag ICML 2018

However, in addition to the text data itself, we often have additional covariates associated with individual corpus documents, e.g., the demographics of the author or the time and venue of publication, and we would like the embedding to naturally capture this information.

Learning Binary Latent Variable Models: A Tensor Eigenpair Approach

arJaffe/BinaryLatentVariables ICML 2018

Latent variable models with hidden binary units appear in various applications.

Tensor Decomposition for Compressing Recurrent Neural Network

androstj/tensor_rnn 28 Feb 2018

In machine learning, the Recurrent Neural Network (RNN) has become a popular architecture for sequential data modeling.
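
As a hedged sketch of how a tensor format can compress an RNN weight, the example below factorizes a single 256 x 256 recurrent matrix in a Tensor-Train-style (TT-matrix) form and compares parameter counts; the modes, ranks, and names are hypothetical, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense 256 x 256 weight viewed as a TT-matrix over factorized
# input/output modes (16*16) x (16*16), with a small bond rank.
in_modes, out_modes = (16, 16), (16, 16)
tt_rank = 4

# TT cores indexed by (bond_in, input_mode, output_mode, bond_out).
G1 = rng.standard_normal((1, in_modes[0], out_modes[0], tt_rank))
G2 = rng.standard_normal((tt_rank, in_modes[1], out_modes[1], 1))

# Contract the cores back into a full (output, input) matrix.
W = np.einsum('aijb,bklc->jlik', G1, G2).reshape(256, 256)

dense_params = 256 * 256
tt_params = G1.size + G2.size
print(f"dense: {dense_params} params, TT: {tt_params} params "
      f"({dense_params / tt_params:.1f}x fewer)")

# The compressed layer applies W to the hidden state as usual.
h = rng.standard_normal(256)
print((W @ h).shape)
```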