Search Results for author: Mehrnoosh Sadrzadeh

Found 49 papers, 7 papers with code

A toy distributional model for fuzzy generalised quantifiers

no code implementations PaM 2020 Mehrnoosh Sadrzadeh, Gijs Wijnholds

It is possible to overcome the computational hurdles by working with fuzzy generalised quantifiers.
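
To make the idea concrete, here is a minimal sketch (my own illustration, not the paper's construction) of the quantifier "most", first as a crisp generalised quantifier on sets and then as a fuzzy quantifier on graded membership; the membership degrees and the one-half threshold are assumptions.

```python
# A minimal sketch of crisp vs fuzzy generalised quantifiers.
# The membership degrees and the 0.5 threshold are illustrative assumptions.

def most_crisp(A, B):
    """Crisp 'most': true iff more than half of A lies in B."""
    return len(A & B) > 0.5 * len(A)

def most_fuzzy(mu_A, mu_B):
    """Fuzzy 'most': ratio of the fuzzy cardinality of A-and-B
    (min of membership degrees) to the fuzzy cardinality of A."""
    inter = sum(min(mu_A[x], mu_B.get(x, 0.0)) for x in mu_A)
    return inter / sum(mu_A.values()) if mu_A else 0.0

# Crisp example: "most dogs bark"
dogs, barkers = {"rex", "fido", "spot"}, {"rex", "fido", "tweety"}
print(most_crisp(dogs, barkers))      # True: 2 of 3 dogs bark

# Fuzzy example: graded membership in "dog" and "barks"
mu_dog   = {"rex": 1.0, "fido": 0.9, "spot": 0.8}
mu_barks = {"rex": 0.7, "fido": 0.6, "tweety": 1.0}
print(most_fuzzy(mu_dog, mu_barks))   # degree to which "most dogs bark" holds
```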

Developments in Sheaf-Theoretic Models of Natural Language Ambiguities

no code implementations 7 Feb 2024 Kin Ian Lo, Mehrnoosh Sadrzadeh, Shane Mansfield

Then, we show how an extension of the natural language processing challenge known as the Winograd Schema, which involves anaphoric ambiguities, can be modelled on the Bell-CHSH scenario with a contextual fraction of 0.096.
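
For orientation, the sketch below (an illustration with a made-up probability table, not the paper's Winograd data) computes the CHSH value of an empirical model in the Bell-CHSH scenario; values above the classical bound of 2 correspond to a non-zero contextual fraction, although computing the fraction itself requires a linear programme that is not shown here.

```python
# Sketch: CHSH value of a two-party, two-setting, two-outcome empirical model.
# Each context (a_i, b_j) lists probabilities for outcomes (0,0), (0,1), (1,0), (1,1).
# The table below is a toy example, not data from the paper.

def correlator(p):
    """E = p(00) + p(11) - p(01) - p(10)."""
    return p[(0, 0)] + p[(1, 1)] - p[(0, 1)] - p[(1, 0)]

model = {
    ("a0", "b0"): {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4},
    ("a0", "b1"): {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4},
    ("a1", "b0"): {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4},
    ("a1", "b1"): {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.4, (1, 1): 0.1},
}

S = (correlator(model[("a0", "b0")]) + correlator(model[("a0", "b1")])
     + correlator(model[("a1", "b0")]) - correlator(model[("a1", "b1")]))
print(S)  # 2.4 here; any |S| > 2 signals contextuality (non-zero contextual fraction)
```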

Towards Transparency in Coreference Resolution: A Quantum-Inspired Approach

1 code implementation 1 Dec 2023 Hadi Wazni, Mehrnoosh Sadrzadeh

Previous work extended the QNLP translation to discourse structure using points in a closure of Hilbert spaces.

Binary Classification coreference-resolution +2

Generalised Winograd Schema and its Contextuality

no code implementations 31 Aug 2023 Kin Ian Lo, Mehrnoosh Sadrzadeh, Shane Mansfield

In this work, we focus on coreference ambiguities and investigate the Winograd Schema Challenge (WSC), a test proposed by Levesque in 2011 to evaluate the intelligence of machines.

coreference-resolution Multiple-choice

Proceedings Modalities in substructural logics: Applications at the interfaces of logic, language and computation

no code implementations 1 Aug 2023 Michael Moortgat, Mehrnoosh Sadrzadeh

By calling into question the implicit structural rules that are taken for granted in classical logic, substructural logics have brought to the fore new forms of reasoning with applications in many interdisciplinary areas of interest.

Management

A Model of Anaphoric Ambiguities using Sheaf Theoretic Quantum-like Contextuality and BERT

no code implementations 11 Aug 2022 Kin Ian Lo, Mehrnoosh Sadrzadeh, Shane Mansfield

Ambiguities of natural language do not preclude us from using it, and context helps in getting ideas across.

A Quantum Natural Language Processing Approach to Pronoun Resolution

no code implementations 10 Aug 2022 Hadi Wazni, Kin Ian Lo, Lachlan McPheat, Mehrnoosh Sadrzadeh

We use the Lambek Calculus with soft sub-exponential modalities to model and reason about discourse relations such as anaphora and ellipsis.

Translation

The Causal Structure of Semantic Ambiguities

no code implementations 14 Jun 2022 Daphne Wang, Mehrnoosh Sadrzadeh

Ambiguity is a natural language phenomenon occurring at different levels of syntax, semantics, and pragmatics.

Permutation invariant matrix statistics and computational language tasks

1 code implementation 14 Feb 2022 Manuel Accettulli Huber, Adriana Correia, Sanjaye Ramgoolam, Mehrnoosh Sadrzadeh

The Linguistic Matrix Theory programme introduced by Kartsaklis, Ramgoolam and Sadrzadeh is an approach to the statistics of matrices generated in type-driven distributional semantics, based on permutation invariant polynomial functions, which are regarded as the key observables encoding the significant statistics.
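
For readers unfamiliar with the terminology, the sketch below (an illustration, not the paper's full set of observables) evaluates a few low-degree permutation invariant polynomials of a matrix and checks that they are unchanged when rows and columns are permuted simultaneously.

```python
import numpy as np

# A few permutation invariant polynomial observables of a D x D matrix M:
# invariance means f(P M P^T) = f(M) for every permutation matrix P.
def observables(M):
    return {
        "sum_diag": np.trace(M),            # linear:    sum_i  M_ii
        "sum_all":  M.sum(),                # linear:    sum_ij M_ij
        "tr_M2":    np.trace(M @ M),        # quadratic: sum_ij M_ij M_ji
        "tr_MMT":   np.trace(M @ M.T),      # quadratic: sum_ij M_ij^2
    }

rng = np.random.default_rng(0)
D = 5
M = rng.normal(size=(D, D))
P = np.eye(D)[rng.permutation(D)]           # random permutation matrix

before = observables(M)
after  = observables(P @ M @ P.T)
for k in before:
    assert np.isclose(before[k], after[k])  # invariants are indeed unchanged
print(before)
```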

Vector Space Semantics for Lambek Calculus with Soft Subexponentials

no code implementations 22 Nov 2021 Lachlan McPheat, Hadi Wazni, Mehrnoosh Sadrzadeh

We develop a vector space semantics for Lambek Calculus with Soft Subexponentials, apply the calculus to construct compositional vector interpretations for parasitic gap noun phrases and discourse units with anaphora and ellipsis, and experiment with the constructions in a distributional sentence similarity task.

Sentence Sentence Similarity

Pregroup Grammars, their Syntax and Semantics

no code implementations 23 Sep 2021 Mehrnoosh Sadrzadeh

Pregroup grammars were developed in 1999 and remained Lambek's preferred algebraic model of grammar.

Cosine Similarity of Multimodal Content Vectors for TV Programmes

no code implementations 23 Sep 2020 Saba Nazir, Taner Cagali, Chris Newell, Mehrnoosh Sadrzadeh

Multimodal information originates from a variety of sources: audiovisual files, textual descriptions, and metadata.
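
A minimal sketch of the basic operation in the title, under my own simplifying assumptions (toy vectors and concatenation as the fusion step, not the paper's actual pipeline):

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy content vectors per modality (subtitles, audio, metadata); in practice
# these would come from text embeddings, audio features and tag encodings.
prog_a = {"text":  np.array([0.2, 0.8, 0.1]),
          "audio": np.array([0.5, 0.1, 0.4]),
          "meta":  np.array([1.0, 0.0, 0.0])}
prog_b = {"text":  np.array([0.1, 0.9, 0.2]),
          "audio": np.array([0.4, 0.2, 0.5]),
          "meta":  np.array([0.0, 1.0, 0.0])}

# One simple fusion choice (an assumption): concatenate the modality vectors.
v_a = np.concatenate([prog_a[m] for m in ("text", "audio", "meta")])
v_b = np.concatenate([prog_b[m] for m in ("text", "audio", "meta")])
print(cosine(v_a, v_b))  # similarity of the two programmes' multimodal content
```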

A Frobenius Algebraic Analysis for Parasitic Gaps

no code implementations 12 May 2020 Michael Moortgat, Mehrnoosh Sadrzadeh, Gijs Wijnholds

The interpretation of parasitic gaps is an ostensible case of non-linearity in natural language composition.

Translation

Categorical Vector Space Semantics for Lambek Calculus with a Relevant Modality

no code implementations 6 May 2020 Lachlan McPheat, Mehrnoosh Sadrzadeh, Hadi Wazni, Gijs Wijnholds

We develop a categorical compositional distributional semantics for Lambek Calculus with a Relevant Modality !L*, which has limited versions of the contraction and permutation rules.

Sentence

Incremental Monoidal Grammars

no code implementations 2 Jan 2020 Dan Shiebler, Alexis Toumi, Mehrnoosh Sadrzadeh

In this work we define formal grammars in terms of free monoidal categories, along with a functor from the category of formal grammars to the category of automata.

BIG-bench Machine Learning Language Modelling

Gaussianity and typicality in matrix distributional semantics

1 code implementation 19 Dec 2019 Sanjaye Ramgoolam, Mehrnoosh Sadrzadeh, Lewis Sword

Using the recently solved general 13-parameter permutation invariant Gaussian matrix models and a dataset of matrices constructed via standard techniques in distributional semantics, we find that the expectation values of a large class of cubic and quartic observables show high Gaussianity at levels between 90 and 99 percent.

Evaluating Composition Models for Verb Phrase Elliptical Sentence Embeddings

1 code implementation NAACL 2019 Gijs Wijnholds, Mehrnoosh Sadrzadeh

Our results show that non-linear addition and a non-linear tensor-based composition outperform the naive non-compositional baselines and the linear models, and that sentence encoders perform well on sentence similarity, but not on verb disambiguation.

Sentence Sentence Embeddings +1

A Type-Driven Vector Semantics for Ellipsis with Anaphora using Lambek Calculus with Limited Contraction

no code implementations 5 May 2019 Gijs Wijnholds, Mehrnoosh Sadrzadeh

We review previous compositional distributional models of relative pronouns, coordination and a restricted account of ellipsis in the DisCoCat framework of Coecke et al. (2010, 2013).

Sentence Word Embeddings

Classical Copying versus Quantum Entanglement in Natural Language: The Case of VP-ellipsis

no code implementations 8 Nov 2018 Gijs Wijnholds, Mehrnoosh Sadrzadeh

This paper compares classical copying and quantum entanglement in natural language by considering the case of verb phrase (VP) ellipsis.

Exploring Semantic Incrementality with Dynamic Syntax and Vector Space Semantics

no code implementations 1 Nov 2018 Mehrnoosh Sadrzadeh, Matthew Purver, Julian Hough, Ruth Kempson

One of the fundamental requirements for models of semantic processing in dialogue is incrementality: a model must reflect how people interpret and generate language at least on a word-by-word basis, and handle phenomena such as fragments, incomplete and jointly-produced utterances.

Static and Dynamic Vector Semantics for Lambda Calculus Models of Natural Language

no code implementations 26 Oct 2018 Mehrnoosh Sadrzadeh, Reinhard Muskens

Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text.

Sentence
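
The co-occurrence idea can be made concrete with a toy sketch (the corpus and window size are assumptions, not taken from the paper): each word's distributional vector records how often other words appear within a fixed window around it.

```python
from collections import defaultdict

# Toy corpus and a symmetric co-occurrence window of size 2 (an assumption).
corpus = "dogs chase cats and cats chase mice and mice eat cheese".split()
window = 2

vocab = sorted(set(corpus))
index = {w: i for i, w in enumerate(vocab)}
counts = defaultdict(lambda: [0] * len(vocab))

for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            counts[w][index[corpus[j]]] += 1

# Each row is the distributional (co-occurrence) vector of a word.
for w in vocab:
    print(w, counts[w])
```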

Linguistic Matrix Theory

no code implementations 28 Mar 2017 Dimitrios Kartsaklis, Sanjaye Ramgoolam, Mehrnoosh Sadrzadeh

We propose a Matrix Theory approach to this data, based on permutation symmetry along with Gaussian weights and their perturbations.

Compositional Distributional Models of Meaning

no code implementations COLING 2016 Mehrnoosh Sadrzadeh, Dimitri Kartsaklis

Compositional distributional models of meaning (CDMs) provide a function that produces a vectorial representation for a phrase or a sentence by composing the vectors of its words.

Machine Translation Natural Language Inference +2
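
As an illustration of what such composition functions look like (a sketch with random toy vectors, not the specific models surveyed here), the snippet below builds a phrase vector by simple addition, by pointwise multiplication, and by contracting a tensor-based transitive verb with its subject and object.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4

# Toy word vectors (assumptions; in practice built from corpus statistics).
dogs, cats = rng.random(d), rng.random(d)
chase = rng.random((d, d, d))         # transitive verb as an order-3 tensor

additive       = dogs + cats          # simple additive composition
multiplicative = dogs * cats          # pointwise (Hadamard) composition

# Tensor-based composition: contract the verb tensor with subject and object
# vectors to obtain a sentence vector for "dogs chase cats".
sentence = np.einsum('i,ijk,k->j', dogs, chase, cats)
print(additive, multiplicative, sentence, sep="\n")
```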

Distributional Inclusion Hypothesis for Tensor-based Composition

no code implementations COLING 2016 Dimitri Kartsaklis, Mehrnoosh Sadrzadeh

According to the distributional inclusion hypothesis, entailment between words can be measured via the feature inclusions of their distributional vectors.

Sentence
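
A minimal sketch of the feature-inclusion idea on word vectors (the toy counts and the plain inclusion ratio are assumptions; the paper itself lifts the hypothesis to tensor-based composition and uses more refined measures):

```python
import numpy as np

def feature_inclusion(u, v):
    """Fraction of u's nonzero features that are also nonzero in v.
    Values near 1 suggest the contexts of u are included in those of v."""
    u_feats = u > 0
    if not u_feats.any():
        return 0.0
    return float(np.logical_and(u_feats, v > 0).sum() / u_feats.sum())

# Toy count vectors over shared context features (assumptions).
cat    = np.array([3, 0, 2, 1, 0])
animal = np.array([5, 2, 4, 3, 1])

print(feature_inclusion(cat, animal))   # high: 'cat' features included in 'animal'
print(feature_inclusion(animal, cat))   # lower: the converse does not hold
```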

Quantifier Scope in Categorical Compositional Distributional Semantics

no code implementations 4 Aug 2016 Mehrnoosh Sadrzadeh

In previous work with J. Hedges, we formalised a generalised quantifiers theory of natural language in categorical compositional distributional semantics with the help of bialgebras.

A Generalised Quantifier Theory of Natural Language in Categorical Compositional Distributional Semantics with Bialgebras

no code implementations 4 Feb 2016 Jules Hedges, Mehrnoosh Sadrzadeh

Categorical compositional distributional semantics is a model of natural language; it combines the statistical vector space models of words with the compositional models of grammar.

Sentence Entailment in Compositional Distributional Semantics

no code implementations 14 Dec 2015 Esma Balkir, Dimitri Kartsaklis, Mehrnoosh Sadrzadeh

In categorical compositional distributional semantics, phrase and sentence representations are functions of their grammatical structure and representations of the words therein.

Sentence

Distributional Sentence Entailment Using Density Matrices

no code implementations 22 Jun 2015 Esma Balkir, Mehrnoosh Sadrzadeh, Bob Coecke

The categorical compositional distributional model of Coecke et al. (2010) suggests a way to combine grammatical composition of the formal, type-logical models with the corpus-based, empirical word representations of distributional semantics.

Lexical Entailment Sentence
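
As a rough sketch of the representation in the title (illustration only: the mixture construction is standard, but the trace-overlap score is my stand-in, not the paper's entailment measure), a word's density matrix can be built as a weighted mixture of outer products of its context vectors:

```python
import numpy as np

def density_matrix(context_vectors, weights):
    """Weighted mixture of normalised outer products, scaled to unit trace."""
    rho = sum(w * np.outer(v, v) / (v @ v)
              for v, w in zip(context_vectors, weights))
    return rho / np.trace(rho)

rng = np.random.default_rng(2)
d = 4
# Toy context vectors for two words (assumptions).
rho_cat    = density_matrix([rng.random(d) for _ in range(3)], [0.5, 0.3, 0.2])
rho_animal = density_matrix([rng.random(d) for _ in range(5)],
                            [0.3, 0.2, 0.2, 0.2, 0.1])

overlap = float(np.trace(rho_cat @ rho_animal))  # simple similarity proxy in [0, 1]
print(overlap)
```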

A Frobenius Model of Information Structure in Categorical Compositional Distributional Semantics

no code implementations WS 2015 Dimitri Kartsaklis, Mehrnoosh Sadrzadeh

The categorical compositional distributional model of Coecke, Sadrzadeh and Clark provides a linguistically motivated procedure for computing the meaning of a sentence as a function of the distributional meaning of the words therein.

Sentence

Open System Categorical Quantum Semantics in Natural Language Processing

no code implementations 3 Feb 2015 Robin Piedeleu, Dimitri Kartsaklis, Bob Coecke, Mehrnoosh Sadrzadeh

Moreover, just like CQM allows for varying the model in which we interpret quantum axioms, one can also vary the model in which we interpret word meaning.

Sentence

Evaluating Neural Word Representations in Tensor-Based Compositional Settings

no code implementations EMNLP 2014 Dmitrijs Milajevs, Dimitri Kartsaklis, Mehrnoosh Sadrzadeh, Matthew Purver

We provide a comparative study between neural word representations and traditional vector spaces based on co-occurrence counts, in a number of compositional tasks.

Sentence Sentence Similarity +1

Resolving Lexical Ambiguity in Tensor Regression Models of Meaning

no code implementations ACL 2014 Dimitri Kartsaklis, Nal Kalchbrenner, Mehrnoosh Sadrzadeh

This paper provides a method for improving tensor-based compositional distributional models of meaning by the addition of an explicit disambiguation step prior to composition.

regression

The Frobenius anatomy of word meanings II: possessive relative pronouns

no code implementations 18 Jun 2014 Mehrnoosh Sadrzadeh, Stephen Clark, Bob Coecke

Within the categorical compositional distributional model of meaning, we provide semantic interpretations for the subject and object roles of the possessive relative pronoun 'whose'.

Anatomy Object

A Study of Entanglement in a Categorical Framework of Natural Language

no code implementations 12 May 2014 Dimitri Kartsaklis, Mehrnoosh Sadrzadeh

In both quantum mechanics and corpus linguistics based on vector spaces, the notion of entanglement provides a means for the various subsystems to communicate with each other.

The Frobenius anatomy of word meanings I: subject and object relative pronouns

no code implementations 21 Apr 2014 Mehrnoosh Sadrzadeh, Stephen Clark, Bob Coecke

This paper develops a compositional vector-based semantics of subject and object relative pronouns within a categorical framework.

Anatomy
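
As a sketch of the mechanism in a fixed basis (a simplified rendering with toy vectors; collapsing the verb's sentence dimension to a matrix is an extra assumption often made in this literature), Frobenius copying becomes pointwise multiplication, so a subject relative clause such as "men who like Mary" combines the head noun with the verb-object contraction by a Hadamard product:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 4

# Toy vectors and a verb matrix (assumptions; the verb's sentence dimension
# has already been collapsed, as in simplified versions of this model).
men  = rng.random(d)
mary = rng.random(d)
like = rng.random((d, d))

# Subject relative clause "men who like Mary":
# Frobenius copying in a fixed basis = pointwise (Hadamard) multiplication,
# so the modified noun is men * (like @ mary).
men_who_like_mary = men * (like @ mary)
print(men_who_like_mary)
```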

Semantic Unification: A Sheaf Theoretic Approach to Natural Language

no code implementations 13 Mar 2014 Samson Abramsky, Mehrnoosh Sadrzadeh

Language is contextual, and sheaf theory provides a high-level mathematical framework to model contextuality.

Sentence

Reasoning about Meaning in Natural Language with Compact Closed Categories and Frobenius Algebras

no code implementations 23 Jan 2014 Dimitri Kartsaklis, Mehrnoosh Sadrzadeh, Stephen Pulman, Bob Coecke

They also provide semantics for Lambek's pregroup algebras, applied to formalizing the grammatical structure of natural language, and are implicit in a distributional model of word meaning based on vector spaces.

A quantum teleportation inspired algorithm produces sentence meaning from word meaning and grammatical structure

no code implementations 2 May 2013 Stephen Clark, Bob Coecke, Edward Grefenstette, Stephen Pulman, Mehrnoosh Sadrzadeh

We discuss an algorithm which produces the meaning of a sentence given meanings of its words, and its resemblance to quantum teleportation.

Sentence

Experimental Support for a Categorical Compositional Distributional Model of Meaning

1 code implementation 20 Jun 2011 Edward Grefenstette, Mehrnoosh Sadrzadeh

The evaluation is based on the word disambiguation task developed by Mitchell and Lapata (2008) for intransitive sentences, and on a similar new experiment designed for transitive sentences.

Mathematical Foundations for a Compositional Distributional Model of Meaning

2 code implementations 23 Mar 2010 Bob Coecke, Mehrnoosh Sadrzadeh, Stephen Clark

We propose a mathematical framework for a unification of the distributional theory of meaning in terms of vector space models, and a compositional theory for grammatical types, for which we rely on the algebra of Pregroups, introduced by Lambek.

Sentence
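
On the grammatical side, the pregroup reduction underlying a simple transitive sentence can be checked mechanically. The sketch below is my own minimal implementation (contractions only, greedy leftmost cancellation, which suffices for this example); it verifies that the type string n · (n^r s n^l) · n reduces to the sentence type s.

```python
# A minimal pregroup reduction checker (sketch; only the generalised
# contraction rule p^(z) p^(z+1) -> 1 is implemented, no expansions).
# A simple type is (base, z): z = 0 plain, +1 right adjoint, -1 left adjoint.
# Greedy leftmost cancellation is adequate for simple sentence types like the
# one below.

def reduces_to(types, target):
    types = list(types)
    changed = True
    while changed:
        changed = False
        for i in range(len(types) - 1):
            (b1, z1), (b2, z2) = types[i], types[i + 1]
            if b1 == b2 and z2 == z1 + 1:       # adjacent pair cancels
                del types[i:i + 2]
                changed = True
                break
    return types == list(target)

n, s = "n", "s"
sentence = [(n, 0),                     # subject noun:            n
            (n, 1), (s, 0), (n, -1),    # transitive verb:   n^r s n^l
            (n, 0)]                     # object noun:             n
print(reduces_to(sentence, [(s, 0)]))   # True: the string parses to type s
```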
