Search Results for author: Guy Emerson

Found 19 papers, 4 papers with code

Distributional Inclusion Hypothesis and Quantifications: Probing for Hypernymy in Functional Distributional Semantics

no code implementations 15 Sep 2023 Chun Hei Lo, Wai Lam, Hong Cheng, Guy Emerson

Functional Distributional Semantics (FDS) models the meaning of words by truth-conditional functions.

Using dependency parsing for few-shot learning in distributional semantics

no code implementations ACL 2022 Stefania Preda, Guy Emerson

In this work, we explore the novel idea of employing dependency parsing information in the context of few-shot learning, the task of learning the meaning of a rare word based on a limited number of context sentences.

Dependency Parsing, Few-Shot Learning

Learning Functional Distributional Semantics with Visual Data

no code implementations ACL 2022 Yinhong Liu, Guy Emerson

In this work, we propose a method to train a Functional Distributional Semantics model with grounded visual data.

Language Acquisition

Incremental Beam Manipulation for Natural Language Generation

1 code implementation EACL 2021 James Hargreaves, Andreas Vlachos, Guy Emerson

For this reason, it is common to rerank the output of beam search, but this relies on beam search to produce a good set of hypotheses, which limits the potential gains.

Text Generation

Investigating Cross-Linguistic Adjective Ordering Tendencies with a Latent-Variable Model

no code implementations EMNLP 2020 Jun Yen Leung, Guy Emerson, Ryan Cotterell

Across languages, multiple consecutive adjectives modifying a noun (e.g. "the big red dog") follow certain unmarked ordering rules.

Linguists Who Use Probabilistic Models Love Them: Quantification in Functional Distributional Semantics

no code implementations PaM 2020 Guy Emerson

Functional Distributional Semantics provides a computationally tractable framework for learning truth-conditional semantics from a corpus.

Bayesian Inference

What are the Goals of Distributional Semantics?

no code implementations ACL 2020 Guy Emerson

Distributional semantic models have become a mainstay in NLP, providing useful features for downstream tasks.

Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics

1 code implementation ACL 2020 Guy Emerson

Functional Distributional Semantics provides a linguistically interpretable framework for distributional semantics, by representing the meaning of a word as a function (a binary classifier), instead of a vector.

Language Modelling, Semantic Composition, +3
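To make the contrast in the snippet above concrete, here is a minimal illustrative sketch of a word represented as a vector versus as a binary classifier over entity representations. This is not the model from the paper; the feature dimensionality, the logistic form of the classifier, and all names are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Vector view: a word is a single point in semantic space,
# and meaning is compared via similarity between points.
vector_for_dog = rng.normal(size=8)

# Functional view (in the spirit of Functional Distributional
# Semantics): a word is a binary classifier that maps an entity
# representation to a probability of the word being true of it.
def make_classifier(weights, bias):
    def is_true_of(entity):
        # Logistic classifier: P(word is true of entity).
        return 1.0 / (1.0 + np.exp(-(weights @ entity + bias)))
    return is_true_of

dog = make_classifier(rng.normal(size=8), bias=0.0)

entity = rng.normal(size=8)   # hypothetical entity representation
p = dog(entity)
assert 0.0 < p < 1.0          # a probability of truth, not a similarity score
```

The key design difference: a vector supports similarity comparisons, while a classifier supports truth-conditional judgements about individual entities, which is what makes the functional representation linguistically interpretable.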

Variational Inference for Logical Inference

no code implementations 1 Sep 2017 Guy Emerson, Ann Copestake

Functional Distributional Semantics is a framework that aims to learn, from text, semantic representations which can be interpreted in terms of truth.

Semantic Similarity, Semantic Textual Similarity, +1

Functional Distributional Semantics

no code implementations WS 2016 Guy Emerson, Ann Copestake

Vector space models have become popular in distributional semantics, despite the challenges they face in capturing various semantic phenomena.

Bayesian Inference, BIG-bench Machine Learning

Resources for building applications with Dependency Minimal Recursion Semantics

no code implementations LREC 2016 Ann Copestake, Guy Emerson, Michael Wayne Goodman, Matic Horvat, Alexander Kuhnle, Ewa Muszyńska

We describe resources aimed at increasing the usability of the semantic representations utilized within the DELPH-IN (Deep Linguistic Processing with HPSG) consortium.
