Search Results for author: Stephen Clark

Found 64 papers, 7 papers with code

Something Old, Something New: Grammar-based CCG Parsing with Transformer Models

no code implementations · 21 Sep 2021 · Stephen Clark

This report describes the parsing problem for Combinatory Categorial Grammar (CCG), showing how a combination of Transformer-based neural models and a symbolic CCG grammar can lead to substantial gains over existing approaches.
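The combinatory rules at the heart of any CCG parser can be illustrated with a toy sketch (not the paper's parser; categories are written left-associatively as plain strings, without parentheses):

```python
# Toy CCG forward/backward application over category strings.
# Categories are written left-associatively, e.g. (S\NP)/NP as "S\NP/NP".

def forward_apply(left, right):
    """X/Y applied to Y yields X (forward application, >)."""
    if left.endswith("/" + right):
        return left[: -(len(right) + 1)]
    return None

def backward_apply(left, right):
    """Y followed by X\\Y yields X (backward application, <)."""
    if right.endswith("\\" + left):
        return right[: -(len(left) + 1)]
    return None

# A transitive verb such as "saw" has category (S\NP)/NP: it first
# combines with an object NP to its right, then a subject NP to its left.
vp = forward_apply("S\\NP/NP", "NP")   # -> "S\NP"
s = backward_apply("NP", vp)           # -> "S"
```

A supertagger assigns each word such a category; the grammar-based parser then only has to search over these combinatory reductions.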

CCG Supertagging

Formalising Concepts as Grounded Abstractions

no code implementations · 13 Jan 2021 · Stephen Clark, Alexander Lerchner, Tamara von Glehn, Olivier Tieleman, Richard Tanburn, Misha Dashevskiy, Matko Bosnjak

The mathematics of partial orders and lattices is a standard tool for modelling conceptual spaces (Ch. 2, Mitchell (1997), Ganter and Obiedkov (2016)); however, there is no formal work that we are aware of which defines a conceptual lattice on top of a representation that is induced using unsupervised deep learning (Goodfellow et al., 2016).

Representation Learning

Learning to Personalize for Web Search Sessions

no code implementations · 17 Sep 2020 · Saad Aloteibi, Stephen Clark

In this paper, we formulate session search as a personalization task under the framework of learning to rank.
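The learning-to-rank framing can be sketched with a minimal pairwise model (an illustrative toy, not the paper's method): a linear scorer is trained so that, within a session, a preferred (e.g. clicked) result scores above a less-preferred one.

```python
import numpy as np

# Toy pairwise learning-to-rank: learn linear weights w so that
# preferred documents outscore less-preferred ones.
rng = np.random.default_rng(0)
w = np.zeros(4)                        # weights over 4 toy features
pairs = [(rng.normal(size=4) + 0.5,    # preferred (e.g. clicked) doc
          rng.normal(size=4))          # less-preferred doc
         for _ in range(200)]

lr = 0.1
for _ in range(50):
    for pos, neg in pairs:
        margin = w @ pos - w @ neg
        # pairwise logistic loss log(1 + exp(-margin)); gradient step
        grad = -(pos - neg) / (1.0 + np.exp(margin))
        w -= lr * grad

# fraction of training pairs ranked correctly after training
acc = np.mean([w @ p > w @ n for p, n in pairs])
```

Real session personalization would derive the features from the user's in-session behaviour, but the pairwise objective has this same shape.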

Learning-To-Rank

Grounded Language Learning Fast and Slow

1 code implementation · ICLR 2021 · Felix Hill, Olivier Tieleman, Tamara von Glehn, Nathaniel Wong, Hamza Merzic, Stephen Clark

Recent work has shown that large text-based neural language models, trained with conventional supervised learning objectives, acquire a surprising propensity for few- and one-shot learning.

Grounded language learning · Meta-Learning · +1

Probing Emergent Semantics in Predictive Agents via Question Answering

no code implementations · ICML 2020 · Abhishek Das, Federico Carnevale, Hamza Merzic, Laura Rimell, Rosalia Schneider, Josh Abramson, Alden Hung, Arun Ahuja, Stephen Clark, Gregory Wayne, Felix Hill

Recent work has shown how predictive modeling can endow agents with rich knowledge of their surroundings, improving their ability to act in complex environments.

Question Answering

Environmental drivers of systematicity and generalization in a situated agent

no code implementations · ICLR 2020 · Felix Hill, Andrew Lampinen, Rosalia Schneider, Stephen Clark, Matthew Botvinick, James L. McClelland, Adam Santoro

The question of whether deep neural networks are good at generalising beyond their immediate training experience is of critical importance for learning-based approaches to AI.

Unity

Neural Generative Rhetorical Structure Parsing

no code implementations · IJCNLP 2019 · Amandla Mabona, Laura Rimell, Stephen Clark, Andreas Vlachos

We show that, for our parser's traversal order, previous beam search algorithms for RNNGs have a left-branching bias which is ill-suited for RST parsing.

Document Classification · Document-level

Scalable Syntax-Aware Language Models Using Knowledge Distillation

no code implementations · ACL 2019 · Adhiguna Kuncoro, Chris Dyer, Laura Rimell, Stephen Clark, Phil Blunsom

Prior work has shown that, on small amounts of training data, syntactic neural language models learn structurally sensitive generalisations more successfully than sequential language models.

Knowledge Distillation · Language Modelling

LSTMs Can Learn Syntax-Sensitive Dependencies Well, But Modeling Structure Makes Them Better

no code implementations · ACL 2018 · Adhiguna Kuncoro, Chris Dyer, John Hale, Dani Yogatama, Stephen Clark, Phil Blunsom

Language exhibits hierarchical structure, but recent work using a subject-verb agreement diagnostic argued that state-of-the-art language models, LSTMs, fail to learn long-range syntax sensitive dependencies.

Hierarchical structure · Language Modelling · +2

Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing

no code implementations · WS 2018 · Jean Maillard, Stephen Clark

Latent tree learning models represent sentences by composing their words according to an induced parse tree, all based on a downstream task.

Factorising AMR generation through syntax

no code implementations · NAACL 2019 · Kris Cao, Stephen Clark

Generating from Abstract Meaning Representation (AMR) is an underspecified problem, as many syntactic decisions are not constrained by the semantic graph.

Emergent Communication through Negotiation

1 code implementation · ICLR 2018 · Kris Cao, Angeliki Lazaridou, Marc Lanctot, Joel Z. Leibo, Karl Tuyls, Stephen Clark

We also study communication behaviour in a setting where one agent interacts with agents in a community with different levels of prosociality and show how agent identifiability can aid negotiation.

Multi-agent Reinforcement Learning

Emergence of Linguistic Communication from Referential Games with Symbolic and Pixel Input

no code implementations · ICLR 2018 · Angeliki Lazaridou, Karl Moritz Hermann, Karl Tuyls, Stephen Clark

The ability of algorithms to evolve or learn (compositional) communication protocols has traditionally been studied in the language evolution literature through the use of emergent communication tasks.

Understanding Grounded Language Learning Agents

no code implementations · ICLR 2018 · Felix Hill, Karl Moritz Hermann, Phil Blunsom, Stephen Clark

Neural network-based systems can now learn to locate the referents of words and phrases in images, answer questions about visual scenes, and even execute symbolic instructions as first-person actors in partially-observable worlds.

Grounded language learning · Policy Gradient Methods

Understanding Early Word Learning in Situated Artificial Agents

no code implementations · ICLR 2018 · Felix Hill, Stephen Clark, Karl Moritz Hermann, Phil Blunsom

Neural network-based systems can now learn to locate the referents of words and phrases in images, answer questions about visual scenes, and execute symbolic instructions as first-person actors in partially-observable worlds.

Grounded language learning · Policy Gradient Methods

Modelling metaphor with attribute-based semantics

no code implementations · EACL 2017 · Luana Bulat, Stephen Clark, Ekaterina Shutova

One of the key problems in computational metaphor modelling is finding the optimal level of abstraction of semantic representations, such that these are able to capture and generalise metaphorical mechanisms.

Machine Translation · Natural Language Inference · +1

Latent Variable Dialogue Models and their Diversity

1 code implementation · EACL 2017 · Kris Cao, Stephen Clark

We present a dialogue generation model that directly captures the variability in possible responses to a given input, which reduces the `boring output' issue of deterministic dialogue models.
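The latent-variable idea can be illustrated with a toy sketch (stand-in encoder and decoder, not the paper's model): sampling a latent code per response lets the same input yield varied outputs, where a deterministic decoder would always produce the same one.

```python
import numpy as np

# Toy latent-variable response generation via the reparameterisation
# trick; the decoder here is a hypothetical stand-in.
rng = np.random.default_rng(0)

def sample_latent(mu, log_var):
    # z = mu + sigma * eps, with eps ~ N(0, I)
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def decode(z):
    # stand-in decoder: different latent regions give different replies
    return "Sure, sounds good." if z[0] > 0 else "Hmm, tell me more."

mu, log_var = np.zeros(2), np.zeros(2)   # prior for some input context
replies = {decode(sample_latent(mu, log_var)) for _ in range(20)}
# across 20 samples, both reply types appear with high probability
```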

Dialogue Generation

Visually Grounded and Textual Semantic Models Differentially Decode Brain Activity Associated with Concrete and Abstract Nouns

no code implementations · TACL 2017 · Andrew J. Anderson, Douwe Kiela, Stephen Clark, Massimo Poesio

Dual coding theory considers concrete concepts to be encoded in the brain both linguistically and visually, and abstract concepts only linguistically.

Virtual Embodiment: A Scalable Long-Term Strategy for Artificial Intelligence Research

no code implementations · 24 Oct 2016 · Douwe Kiela, Luana Bulat, Anita L. Vero, Stephen Clark

Meaning has been called the "holy grail" of a variety of scientific disciplines, ranging from linguistics to philosophy, psychology and the neurosciences.

Using Sentence Plausibility to Learn the Semantics of Transitive Verbs

no code implementations · 28 Nov 2014 · Tamara Polajnar, Laura Rimell, Stephen Clark

The functional approach to compositional distributional semantics considers transitive verbs to be linear maps that transform the distributional vectors representing nouns into a vector representing a sentence.
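One simple instantiation of this "verb as linear map" idea can be sketched as follows (toy numbers, and one of several composition schemes from this literature, not necessarily the paper's exact model):

```python
import numpy as np

# A transitive verb as a matrix V acting on noun vectors; the sentence
# vector for "subj verb obj" combines the subject with V applied to
# the object, here elementwise.
rng = np.random.default_rng(0)
dim = 5
subj, obj = rng.random(dim), rng.random(dim)   # distributional noun vectors
V = rng.random((dim, dim))                     # verb as linear map

sentence = subj * (V @ obj)   # vector representing "subj verb obj"
```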

The Frobenius anatomy of word meanings II: possessive relative pronouns

no code implementations · 18 Jun 2014 · Mehrnoosh Sadrzadeh, Stephen Clark, Bob Coecke

Within the categorical compositional distributional model of meaning, we provide semantic interpretations for the subject and object roles of the possessive relative pronoun `whose'.

Evaluation of Simple Distributional Compositional Operations on Longer Texts

no code implementations · LREC 2014 · Tamara Polajnar, Laura Rimell, Stephen Clark

Distributional semantic models have been effective at representing linguistic semantics at the word level, and more recently research has moved on to the construction of distributional representations for larger segments of text.

Semantic Textual Similarity · Sentence segmentation

The Frobenius anatomy of word meanings I: subject and object relative pronouns

no code implementations · 21 Apr 2014 · Mehrnoosh Sadrzadeh, Stephen Clark, Bob Coecke

This paper develops a compositional vector-based semantics of subject and object relative pronouns within a categorical framework.

Learning Type-Driven Tensor-Based Meaning Representations

no code implementations · 20 Dec 2013 · Tamara Polajnar, Luana Fagarasan, Stephen Clark

This paper investigates the learning of 3rd-order tensors representing the semantics of transitive verbs.

A quantum teleportation inspired algorithm produces sentence meaning from word meaning and grammatical structure

no code implementations · 2 May 2013 · Stephen Clark, Bob Coecke, Edward Grefenstette, Stephen Pulman, Mehrnoosh Sadrzadeh

We discuss an algorithm which produces the meaning of a sentence given meanings of its words, and its resemblance to quantum teleportation.

Mathematical Foundations for a Compositional Distributional Model of Meaning

2 code implementations · 23 Mar 2010 · Bob Coecke, Mehrnoosh Sadrzadeh, Stephen Clark

We propose a mathematical framework for a unification of the distributional theory of meaning in terms of vector space models, and a compositional theory for grammatical types, for which we rely on the algebra of Pregroups, introduced by Lambek.
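The core compositional mechanism can be sketched numerically (toy dimensions and random values, purely illustrative): under the pregroup analysis a transitive verb lives in the tensor space N ⊗ S ⊗ N, and the sentence meaning is obtained by contracting the verb tensor with the subject and object vectors.

```python
import numpy as np

# Toy tensor-contraction composition for "subj verb obj": the verb is
# an order-3 tensor whose noun indices are contracted with the two
# noun vectors, leaving a vector in the sentence space S.
rng = np.random.default_rng(0)
n_dim, s_dim = 4, 3
subj = rng.random(n_dim)                  # vector in noun space N
obj = rng.random(n_dim)
verb = rng.random((n_dim, s_dim, n_dim))  # tensor in N ⊗ S ⊗ N

sentence = np.einsum("i,isj,j->s", subj, verb, obj)
```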
