
Modern data sources are typically large-scale and multi-modal, and are often acquired on irregular domains, which poses serious challenges to traditional deep learning models.

We establish a direct connection between general tensor networks and deep feed-forward artificial neural networks.

We investigate the potential of tensor network based machine learning methods to scale to large image and text data sets.

We take the tensor network describing explicit p-adic CFT partition functions proposed in [1], and consider boundary conditions of the network describing a deformed Bruhat-Tits (BT) tree geometry.

TENSOR NETWORKS HIGH ENERGY PHYSICS - THEORY STRONGLY CORRELATED ELECTRONS GENERAL RELATIVITY AND QUANTUM COSMOLOGY MATHEMATICAL PHYSICS QUANTUM PHYSICS

We propose a unified scheme to identify phase transitions out of the $\mathbb{Z}_2$ Abelian topological order, including the transition to a non-Abelian chiral spin liquid.

TENSOR NETWORKS STRONGLY CORRELATED ELECTRONS

We introduce a hybrid model combining a quantum-inspired tensor network and a variational quantum circuit to perform supervised learning tasks.

Recent progress in studies of holographic dualities, originally motivated by insights from string theory, has led to a confluence with concepts and techniques from quantum information theory.

TENSOR NETWORKS QUANTUM PHYSICS STRONGLY CORRELATED ELECTRONS HIGH ENERGY PHYSICS - THEORY

To answer the latter, we consider approximation classes of TNs as a candidate model class and show that these are (quasi-)Banach spaces, that many types of classical smoothness spaces embed continuously into said approximation classes, and that TN approximation classes are themselves not embedded in any classical smoothness space.

We show that the weights of a multidimensional regression model can be learned by means of a TT network, and that the optimization of TT weights is more robust to the impact of coefficient initialization and hyper-parameter settings.
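The idea above can be illustrated with a minimal sketch (an assumption of ours, not the paper's exact method): the weight tensor of a multidimensional linear model is stored as tensor-train (TT) cores, contracted back to a full tensor, and used for prediction via an inner product.

```python
import numpy as np

def tt_to_full(cores):
    """Contract a list of TT cores with shapes (r_{k-1}, n_k, r_k)
    into the full weight tensor of shape (n_1, ..., n_d)."""
    full = cores[0]  # shape (1, n_1, r_1)
    for core in cores[1:]:
        # merge over the shared TT rank index
        full = np.tensordot(full, core, axes=([-1], [0]))
    # drop the boundary rank indices of size 1
    return full.squeeze(axis=(0, -1))

rng = np.random.default_rng(0)
# three modes of size 4, internal TT rank 3 (illustrative choices)
shapes = [(1, 4, 3), (3, 4, 3), (3, 4, 1)]
cores = [rng.standard_normal(s) for s in shapes]

W = tt_to_full(cores)                 # full weight tensor, shape (4, 4, 4)
x = rng.standard_normal((4, 4, 4))    # one multidimensional input sample
y_pred = np.tensordot(W, x, axes=3)   # scalar prediction <W, x>
print(W.shape, float(y_pred))
```

In an actual TT regression, one would optimize the small cores directly (e.g. by alternating least squares) instead of ever forming `W`, which is what keeps the parameter count low and, per the abstract, the optimization robust.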

We show that our quantum-inspired generative models based on tensor networks generalize to unseen candidates with lower cost function values than any of the candidates seen by the classical solvers.

COMBINATORIAL OPTIMIZATION PORTFOLIO OPTIMIZATION QUANTUM MACHINE LEARNING TENSOR NETWORKS QUANTUM PHYSICS