Tensor Networks

59 papers with code • 0 benchmarks • 0 datasets

Tensor networks factorize high-order tensors into networks of low-order tensors that are efficient to store and contract; they originated in quantum many-body physics and are increasingly used in machine learning.

Most implemented papers

Can recursive neural tensor networks learn logical reasoning?

sleepinyourhat/vector-entailment 21 Dec 2013

Recursive neural network models and their accompanying vector representations for words have seen success in an array of increasingly semantically sophisticated tasks, but almost nothing is known about their ability to accurately capture the aspects of linguistic meaning that are necessary for interpretation or reasoning.

Tensor Ring Decomposition

zhaoxile/reproducible-tensor-completion-state-of-the-art 17 Jun 2016

In this paper, we introduce a fundamental tensor decomposition model to represent a large-dimensional tensor by circular multilinear products over a sequence of low-dimensional cores, which can be graphically interpreted as a cyclic interconnection of 3rd-order tensors and is thus termed the tensor ring (TR) decomposition.
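As a concrete sketch of the format described above (assuming cores G_k of shape (r_{k-1}, n_k, r_k) with matching first and last bond dimensions; the helper name is illustrative), the full tensor can be rebuilt from its TR cores like this:

```python
import numpy as np

def tr_reconstruct(cores):
    """Rebuild the full tensor from tensor-ring (TR) cores.

    Each core G_k has shape (r_{k-1}, n_k, r_k), and r_0 == r_d so the chain
    of bond dimensions closes into a ring; entry (i_1, ..., i_d) of the full
    tensor is Tr(G_1[:, i_1, :] @ G_2[:, i_2, :] @ ... @ G_d[:, i_d, :]).
    """
    full = cores[0]                                   # shape (r_0, n_1, r_1)
    for core in cores[1:]:
        # contract the trailing bond of `full` with the leading bond of `core`
        full = np.einsum('a...b,bcd->a...cd', full, core)
    # close the ring: trace over the first and last (bond) axes
    return np.einsum('a...a->...', full)

# toy example: a 4x5x6 tensor with all TR ranks equal to 3
rng = np.random.default_rng(0)
cores = [rng.standard_normal((3, n, 3)) for n in (4, 5, 6)]
print(tr_reconstruct(cores).shape)   # (4, 5, 6)
```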

Supervised Learning with Tensor Networks

emstoudenmire/TNML NeurIPS 2016

Tensor networks are approximations of high-order tensors that are efficient to work with and have been very successful in physics and mathematics applications.
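To make the setup concrete, here is a rough sketch, under assumptions, of the kind of model studied in this line of work: a linear classifier applied to a tensor-product feature map, with the exponentially large weight tensor stored as a matrix product state and evaluated core by core. The local feature map and all names below are illustrative choices, not necessarily the paper's exact ones.

```python
import numpy as np

def local_feature(x):
    # a common two-dimensional local feature map (illustrative choice)
    return np.array([np.cos(np.pi * x / 2.0), np.sin(np.pi * x / 2.0)])

def mps_score(cores, pixels):
    """Evaluate f(x) = W . Phi(x) with the weight tensor W stored as an MPS.

    cores[k] has shape (r_{k-1}, 2, r_k) with r_0 = r_N = 1; Phi(x) is the
    tensor product of the local feature vectors of the input values.
    """
    msg = np.ones(1)
    for core, x in zip(cores, pixels):
        # absorb one input site: contract the running message, the core,
        # and the local feature vector into the next message
        msg = np.einsum('a,abc,b->c', msg, core, local_feature(x))
    return msg.item()

rng = np.random.default_rng(0)
ranks = [1, 4, 4, 4, 1]
cores = [rng.standard_normal((ranks[k], 2, ranks[k + 1])) for k in range(4)]
print(mps_score(cores, pixels=[0.1, 0.8, 0.3, 0.6]))
```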

Logic Tensor Networks for Semantic Image Interpretation

sbadredd/semantic-pascal-part 24 May 2017

Logic Tensor Networks (LTNs) are a statistical relational learning (SRL) framework that integrates neural networks with first-order fuzzy logic to allow (i) efficient learning from noisy data in the presence of logical constraints, and (ii) reasoning with logical formulas describing general properties of the data.
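As a rough illustration of the fuzzy-logic ingredient (not the authors' implementation or API): predicates are grounded as differentiable functions with outputs in [0, 1], and connectives become soft operations such as t-norms, so the truth of a formula is a differentiable quantity that can be maximized during training. The predicate, weights, and constants below are invented for the example.

```python
import numpy as np

def smokes(x, w=np.array([1.0, -2.0])):
    # toy "predicate": a sigmoid over features gives a degree of truth in [0, 1]
    return 1.0 / (1.0 + np.exp(-x @ w))

def fuzzy_and(a, b):          # product t-norm
    return a * b

def fuzzy_implies(a, b):      # Reichenbach implication: 1 - a + a*b
    return 1.0 - a + a * b

person = np.array([0.3, 0.7])
cancer_truth = 0.9            # pretend output of another grounded predicate
# truth degree of the formula smokes(person) -> cancer(person)
print(fuzzy_implies(smokes(person), cancer_truth))
```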

Tensor Networks for Dimensionality Reduction and Large-Scale Optimizations. Part 2 Applications and Future Perspectives

rballester/ttrecipes 30 Aug 2017

Part 2 of this monograph builds on the introduction to tensor networks and their operations presented in Part 1.

On the Long-Term Memory of Deep Recurrent Networks

HUJI-Deep/Long-Term-Memory-of-Deep-RNNs 25 Oct 2017

A key attribute that drives the unprecedented success of modern Recurrent Neural Networks (RNNs) on learning tasks involving sequential data is their ability to model intricate long-term temporal dependencies.

A Generalized Language Model in Tensor Space

TJUIRLAB/AAAI19-TSLM 31 Jan 2019

Theoretically, we prove that such a tensor representation is a generalization of the n-gram language model.

TensorNetwork for Machine Learning

google/TensorNetwork 7 Jun 2019

We demonstrate the use of tensor networks for image classification with the TensorNetwork open source library.
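A minimal usage sketch along the lines of the library's introductory example (worth double-checking against the current API):

```python
import numpy as np
import tensornetwork as tn

# two nodes connected by one edge; contracting the edge sums over the shared index
a = tn.Node(np.ones((10,)))
b = tn.Node(np.ones((10,)))
edge = a[0] ^ b[0]          # connect the two dangling edges
result = tn.contract(edge)  # contract the network down to a single node
print(result.tensor)        # 10.0: the inner product of the two all-ones vectors
```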

Expressive power of tensor-network factorizations for probabilistic modeling, with applications from hidden Markov models to quantum machine learning

glivan/tensor_networks_for_probabilistic_modeling 8 Jul 2019

Inspired by these developments, and the natural correspondence between tensor networks and probabilistic graphical models, we provide a rigorous analysis of the expressive power of various tensor-network factorizations of discrete multivariate probability distributions.
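Schematically, two of the factorization families compared in such analyses can be written as follows (standard forms, with trace boundary conditions chosen for brevity; the paper's exact conventions may differ):

```latex
% non-negative matrix-product (hidden-Markov-like) factorization
p(x_1,\dots,x_N) = \frac{1}{Z}\,\operatorname{Tr}\!\bigl[A^{(1)}_{x_1} A^{(2)}_{x_2} \cdots A^{(N)}_{x_N}\bigr],
\qquad A^{(k)}_{x_k} \ge 0,

% Born-machine factorization: probabilities as squared amplitudes
p(x_1,\dots,x_N) = \frac{1}{Z}\,\Bigl|\operatorname{Tr}\!\bigl[B^{(1)}_{x_1} B^{(2)}_{x_2} \cdots B^{(N)}_{x_N}\bigr]\Bigr|^{2}.
```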

Efficient Contraction of Large Tensor Networks for Weighted Model Counting through Graph Decompositions

vardigroup/TensorOrder 12 Aug 2019

We show that tree decompositions can be used both to find carving decompositions and to factor tensor networks with high-rank, structured tensors.
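The underlying reduction is easy to illustrate: each Boolean variable becomes an index of size 2, each clause becomes a 0/1 tensor over its variables, and contracting the resulting network sums the product of clause tensors over all assignments, i.e. it computes the model count. A toy unweighted instance is below (the paper's contribution is finding good contraction orders for large networks via graph decompositions, not this reduction itself):

```python
import numpy as np

# clause tensors: entry is 1 iff the clause is satisfied by that assignment
c1 = np.array([[0, 1], [1, 1]])   # (x1 OR x2), indexed by (x1, x2)
c2 = np.array([[1, 1], [0, 1]])   # (NOT x2 OR x3), indexed by (x2, x3)

# contracting the network sums the product over all assignments of x1, x2, x3,
# which is exactly the number of satisfying assignments of the formula
count = np.einsum('ab,bc->', c1, c2)
print(count)   # 4 models of (x1 OR x2) AND (NOT x2 OR x3)
```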