# Tensor Networks

52 papers with code • 0 benchmarks • 0 datasets

## Most implemented papers

# Supervised Learning with Quantum-Inspired Tensor Networks

Tensor networks are efficient representations of high-dimensional tensors that have been very successful in physics and mathematics applications.
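The core idea — representing a high-dimensional tensor as a chain of small low-rank cores — can be sketched in plain NumPy via successive SVDs (a matrix product state / tensor-train factorization). This is an illustrative sketch, not the paper's implementation; the function names are invented for the example.

```python
import numpy as np

def mps_decompose(tensor, bond_dim=None):
    """Factor an n-index tensor into a chain of rank-3 MPS cores via SVD.

    Each core has shape (left_bond, physical_dim, right_bond).
    With bond_dim=None no truncation is applied, so the
    factorization is exact up to floating-point error.
    """
    dims = tensor.shape
    n = len(dims)
    cores = []
    rest = tensor.reshape(1, -1)
    left = 1
    for k in range(n - 1):
        # Split off one physical index at a time.
        rest = rest.reshape(left * dims[k], -1)
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        r = len(s) if bond_dim is None else min(bond_dim, len(s))
        cores.append(u[:, :r].reshape(left, dims[k], r))
        rest = s[:r, None] * vh[:r]
        left = r
    cores.append(rest.reshape(left, dims[-1], 1))
    return cores

def mps_reconstruct(cores):
    """Contract the chain of cores back into a dense tensor."""
    out = cores[0]
    for c in cores[1:]:
        out = np.tensordot(out, c, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

Truncating `bond_dim` is what makes the representation *efficient*: storage drops from exponential in the number of indices to linear, at the cost of a controlled low-rank approximation.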

# Machine Learning by Unitary Tensor Network of Hierarchical Tree Structure

We study the quantum features of the TN states, including quantum entanglement and fidelity.
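One standard way to quantify the entanglement of a tensor-network state is the von Neumann entropy of its Schmidt spectrum across a bipartition. The sketch below (plain NumPy, not the paper's code) computes it for a bipartite pure state.

```python
import numpy as np

def entanglement_entropy(state, dim_a):
    """Von Neumann entropy (in bits) across an A|B bipartition.

    `state` is a flat state vector; `dim_a` is the dimension of
    subsystem A. The Schmidt coefficients are the singular values
    of the state reshaped into a dim_a x dim_b matrix.
    """
    m = state.reshape(dim_a, -1)
    s = np.linalg.svd(m, compute_uv=False)
    p = s ** 2
    p = p[p > 1e-12]  # drop numerically-zero Schmidt weights
    return float(-np.sum(p * np.log2(p)))

# Bell state (|00> + |11>)/sqrt(2): maximally entangled qubit pair.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
# Product state |00>: no entanglement.
prod = np.array([1.0, 0.0, 0.0, 0.0])
```

For the Bell state the entropy is 1 bit (two equal Schmidt weights); for a product state it is 0.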

# TensorNetwork on TensorFlow: A Spin Chain Application Using Tree Tensor Networks

TensorNetwork is an open source library for implementing tensor network algorithms in TensorFlow.
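The kind of contraction such a library automates can be sketched in plain NumPy for a tiny tree tensor network: two leaf tensors feed a root tensor through shared bond indices. This is an illustrative sketch with invented shapes, not the TensorNetwork library's API.

```python
import numpy as np

# Tiny binary tree: two leaves, one root.
# Leaf shape: (physical_dim, bond_dim); root shape: (bond, bond, output).
rng = np.random.default_rng(0)
d, chi, out = 2, 3, 4
leaf_l = rng.standard_normal((d, chi))
leaf_r = rng.standard_normal((d, chi))
root = rng.standard_normal((chi, chi, out))

# Contract both bond indices: the result maps the two physical
# indices (i, j) to the root's output index k.
tree = np.einsum('ia,jb,abk->ijk', leaf_l, leaf_r, root)
assert tree.shape == (d, d, out)
```

A tensor-network library's job is to pick a good contraction order automatically; for larger networks the order can change the cost by orders of magnitude.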

# Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge

We propose Logic Tensor Networks: a uniform framework for integrating automatic learning and reasoning.

# Ask Me Even More: Dynamic Memory Tensor Networks (Extended Model)

We propose extensions to the Dynamic Memory Network (DMN), specifically within its attention mechanism; we call the resulting architecture the Dynamic Memory Tensor Network (DMTN).

# TensorNetwork: A Library for Physics and Machine Learning

TensorNetwork is an open source library for implementing tensor network algorithms.

# Compensating Supervision Incompleteness with Prior Knowledge in Semantic Image Interpretation

This requires the detection of visual relationships: triples (subject, relation, object) describing a semantic relation between a subject and an object.

# Tensor Networks for Medical Image Classification

With the increasing adoption of machine learning tools like neural networks across several domains, interesting connections and comparisons to concepts from other domains are coming to light.

# Representing Prior Knowledge Using Randomly, Weighted Feature Networks for Visual Relationship Detection

Furthermore, background knowledge represented by RWFNs can be used to alleviate the incompleteness of training sets, even though the space complexity of RWFNs is much smaller than that of LTNs (a 1:27 ratio).

# Can recursive neural tensor networks learn logical reasoning?

Recursive neural network models and their accompanying vector representations for words have seen success in an array of increasingly semantically sophisticated tasks, but almost nothing is known about their ability to accurately capture the aspects of linguistic meaning that are necessary for interpretation or reasoning.