no code implementations • 15 Jun 2021 • Tai-Danae Bradley, John Terilla, Yiannis Vlassopoulos
In this paper, we propose a mathematical framework for passing from probability distributions on extensions of given texts, such as the ones learned by today's large language models, to an enriched category containing semantic information.
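The passage above describes enriching over probabilities of text extensions. A minimal sketch of that idea, with a hypothetical toy corpus standing in for a learned language model: texts are objects, and the hom-object from a text to one of its extensions is the conditional extension probability, valued in the unit interval with multiplication as composition. The corpus, phrase counts, and `hom` function below are illustrative assumptions, not the paper's exact construction.

```python
from collections import Counter

# Hypothetical toy corpus; phrase counts play the role of the learned model.
corpus = "red fox red fox runs red dog".split()

# Count every contiguous phrase (as a tuple of words).
counts = Counter()
for i in range(len(corpus)):
    for j in range(i + 1, len(corpus) + 1):
        counts[tuple(corpus[i:j])] += 1

def hom(x, y):
    """Enriched hom-object in [0,1]: probability that phrase x extends to y.
    Nonzero only when y actually extends x (x is a prefix of y)."""
    if y[:len(x)] != x or counts[x] == 0:
        return 0.0
    return counts[y] / counts[x]

x, y, z = ("red",), ("red", "fox"), ("red", "fox", "runs")
# Composition in a category enriched over ([0,1], *):
# hom(x, y) * hom(y, z) <= hom(x, z); with count ratios it is an equality,
# since the chain rule cancels the intermediate count.
assert abs(hom(x, y) * hom(y, z) - hom(x, z)) < 1e-12
```

The composition inequality is exactly the enriched-category composition law over the monoidal poset \([0,1]\) with multiplication; the count-ratio model satisfies it with equality by the chain rule.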
no code implementations • 8 Jul 2020 • Tai-Danae Bradley, Yiannis Vlassopoulos
We answer this by constructing a functor from our enriched category of text to a particular enriched category of reduced density operators.
no code implementations • 4 Nov 2017 • Vasily Pestun, John Terilla, Yiannis Vlassopoulos
We propose a statistical model for natural language that begins by considering language as a monoid and then represents it in complex matrices equipped with a compatible translation-invariant probability measure.
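The monoid viewpoint above can be sketched in a few lines: words generate a free monoid under concatenation, and a representation assigns each word a complex matrix so that concatenation becomes matrix multiplication. The particular generator matrices below are arbitrary illustrative choices, not the paper's construction.

```python
def matmul(a, b):
    """2x2 matrix product (entries may be complex)."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Assign an arbitrary invertible complex 2x2 matrix to each generator word
# (hypothetical values chosen only to illustrate the homomorphism).
rep_gen = {
    "the": [[1, 1j], [0, 1]],
    "cat": [[1, 0], [2, 1]],
    "sat": [[0, -1], [1, 0]],
}

def rep(sentence):
    """Extend the generator assignment to the whole monoid: the empty
    sentence maps to the identity, and concatenation maps to the product."""
    m = [[1, 0], [0, 1]]  # identity matrix = image of the empty word
    for w in sentence.split():
        m = matmul(m, rep_gen[w])
    return m

# Monoid homomorphism property: rep("the cat") * rep("sat") == rep("the cat sat")
assert matmul(rep("the cat"), rep("sat")) == rep("the cat sat")
```

The homomorphism property is the whole point of the representation: once it holds, statistics of long strings can be computed from products of the per-word matrices.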
no code implementations • 27 Oct 2017 • Vasily Pestun, Yiannis Vlassopoulos
We propose a new statistical model suitable for machine learning of systems with long-distance correlations, such as natural languages.