no code implementations • 15 Sep 2023 • Chun Hei Lo, Wai Lam, Hong Cheng, Guy Emerson
Functional Distributional Semantics (FDS) models the meaning of words by truth-conditional functions.
no code implementations • ACL 2022 • Stefania Preda, Guy Emerson
In this work, we explore the novel idea of employing dependency-parsing information in the context of few-shot learning, the task of learning the meaning of a rare word from a limited number of context sentences.
3 code implementations • 30 Apr 2022 • Fangyu Liu, Guy Emerson, Nigel Collier
Spatial relations are a basic part of human cognition.
Ranked #1 on Visual Reasoning on VSR
no code implementations • ACL 2022 • Yinhong Liu, Guy Emerson
In this work, we propose a method to train a Functional Distributional Semantics model with grounded visual data.
1 code implementation • EACL 2021 • James Hargreaves, Andreas Vlachos, Guy Emerson
For this reason, it is common to rerank the output of beam search, but this relies on beam search to produce a good set of hypotheses, which limits the potential gains.
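Reranking beam-search output can be sketched in a few lines. This is a toy illustration of the general setup, not the paper's method: the hypotheses, their model scores, and the reranker (imagined here as a trained quality estimator) are all invented for the example.

```python
# Toy sketch of reranking: beam search produces a hypothesis set,
# then an external scorer reorders it.  All data below is illustrative.

def rerank(hypotheses, rerank_score):
    """Return hypotheses sorted best-first by an external scoring function."""
    return sorted(hypotheses, key=rerank_score, reverse=True)

# Hypotheses as (text, beam-search log-probability) pairs.
beam_output = [
    ("the dog barked", -1.2),
    ("the dog barked loudly", -1.5),
    ("dog barked the", -1.1),  # high model score, but disfluent
]

# Hypothetical reranker scores, e.g. from a separate quality estimator.
quality = {
    "the dog barked": 0.9,
    "the dog barked loudly": 0.8,
    "dog barked the": 0.2,
}

best = rerank(beam_output, lambda hyp: quality[hyp[0]])[0]
print(best[0])
```

Note the limitation the snippet makes visible: the reranker can only choose among what beam search returned, so a weak hypothesis set caps the achievable quality.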
no code implementations • EMNLP 2020 • Jun Yen Leung, Guy Emerson, Ryan Cotterell
Across languages, multiple consecutive adjectives modifying a noun (e.g. "the big red dog") follow certain unmarked ordering rules.
no code implementations • PaM 2020 • Guy Emerson
Functional Distributional Semantics provides a computationally tractable framework for learning truth-conditional semantics from a corpus.
1 code implementation • ACL 2020 • Guy Emerson
Functional Distributional Semantics provides a linguistically interpretable framework for distributional semantics, by representing the meaning of a word as a function (a binary classifier), instead of a vector.
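The core idea of a word meaning as a binary classifier, rather than a vector, can be illustrated with a minimal sketch. The feature names and weights below are invented for the example; in Functional Distributional Semantics such classifiers are learnt from a corpus.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def make_classifier(weights, bias):
    """Return a truth-conditional function: entity features -> P(word applies)."""
    def classify(entity):
        score = sum(weights.get(f, 0.0) * v for f, v in entity.items()) + bias
        return sigmoid(score)
    return classify

# Hypothetical meaning of "dog": a function over entities, not a point in space.
dog = make_classifier({"furry": 2.0, "barks": 3.0, "wheels": -4.0}, bias=-2.0)

fido = {"furry": 1.0, "barks": 1.0}
car = {"wheels": 1.0}
print(dog(fido), dog(car))
```

The payoff for interpretability is that `dog(entity)` is directly a probability of truth: it can be high for one entity and low for another, whereas a single vector for "dog" has no built-in notion of being true or false of anything.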
no code implementations • ACL 2020 • Guy Emerson
Distributional semantic models have become a mainstay in NLP, providing useful features for downstream tasks.
no code implementations • WS 2019 • Jeroen Van Hautte, Guy Emerson, Marek Rei
Word embeddings are an essential component in a wide range of natural language processing applications.
no code implementations • WS 2020 • Sebastian Borgeaud, Guy Emerson
We propose a method for natural language generation, choosing the most representative output rather than the most likely output.
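One simple way to operationalise "most representative" (as opposed to most likely) is to pick the candidate with the highest total similarity to the other candidates, in the spirit of minimum-Bayes-risk selection. This is a hedged sketch of that general idea, not the paper's actual method; the candidate sentences and the token-overlap similarity are illustrative.

```python
def overlap(a, b):
    """Jaccard token-overlap similarity between two strings."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb)

def most_representative(candidates):
    """Pick the candidate closest, on average, to all the others."""
    return max(
        candidates,
        key=lambda c: sum(overlap(c, other) for other in candidates if other != c),
    )

candidates = [
    "a man rides a horse",
    "a man is riding a horse",
    "someone on an animal",
]
print(most_representative(candidates))
```

The most likely single output can be an outlier, while a representative output sits near the centre of the model's distribution over candidates.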
no code implementations • WS 2019 • Paula Czarnowska, Guy Emerson, Ann Copestake
Distributional Semantic Models (DSMs) construct vector representations of word meanings based on their contexts.
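A minimal count-based DSM shows what "vector representations based on contexts" means concretely. The tiny corpus and window size below are made up for illustration; real DSMs use large corpora and usually apply weighting such as PPMI.

```python
import math
from collections import Counter

corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the dog barked at the mailman",
]

def cooccurrence(sentences, window=2):
    """Count, for each word, the words appearing within +/- window tokens."""
    vecs = {}
    for sentence in sentences:
        toks = sentence.split()
        for i, w in enumerate(toks):
            context = toks[max(0, i - window):i] + toks[i + 1:i + 1 + window]
            vecs.setdefault(w, Counter()).update(context)
    return vecs

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

vecs = cooccurrence(corpus)
print(cosine(vecs["cat"], vecs["dog"]))
```

Even on this toy corpus, "cat" and "dog" end up closer to each other than "cat" and "mailman", because they share contexts like "chased" and "the".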
no code implementations • WS 2017 • Guy Emerson, Ann Copestake
Semantic composition remains an open problem for vector space models of semantics.
no code implementations • 1 Sep 2017 • Guy Emerson, Ann Copestake
Functional Distributional Semantics is a framework that aims to learn, from text, semantic representations which can be interpreted in terms of truth.
no code implementations • WS 2016 • Guy Emerson, Ann Copestake
Vector space models have become popular in distributional semantics, despite the challenges they face in capturing various semantic phenomena.
no code implementations • LREC 2016 • Ann Copestake, Guy Emerson, Michael Wayne Goodman, Matic Horvat, Alexander Kuhnle, Ewa Muszyńska
We describe resources aimed at increasing the usability of the semantic representations utilized within the DELPH-IN (Deep Linguistic Processing with HPSG) consortium.