Search Results for author: Tamara von Glehn

Found 3 papers, 1 paper with code

Formalising Concepts as Grounded Abstractions

no code implementations • 13 Jan 2021 • Stephen Clark, Alexander Lerchner, Tamara von Glehn, Olivier Tieleman, Richard Tanburn, Misha Dashevskiy, Matko Bosnjak

The mathematics of partial orders and lattices is a standard tool for modelling conceptual spaces (Ch. 2, Mitchell (1997), Ganter and Obiedkov (2016)); however, we are aware of no formal work that defines a conceptual lattice on top of a representation induced using unsupervised deep learning (Goodfellow et al., 2016).
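The lattice formalism referenced in this abstract can be illustrated with a minimal formal-concept-analysis sketch: a formal concept is a pair (extent, intent) where the objects in the extent share exactly the attributes in the intent, and the concepts ordered by extent inclusion form a lattice. The toy object-attribute context below is a hypothetical example, not data from the paper.

```python
# Minimal formal concept analysis sketch (hypothetical toy context;
# the objects and attributes are illustrative assumptions, not the
# paper's grounded representations).
from itertools import combinations

context = {
    "sparrow": {"flies", "has_feathers"},
    "penguin": {"swims", "has_feathers"},
    "trout":   {"swims", "has_scales"},
}

ALL_ATTRS = set.union(*context.values())

def common_attributes(objects):
    """Derivation A': attributes shared by every object in the set."""
    attrs = set(ALL_ATTRS)
    for obj in objects:
        attrs &= context[obj]
    return attrs

def matching_objects(attrs):
    """Derivation B': objects possessing every attribute in the set."""
    return {o for o, a in context.items() if attrs <= a}

def formal_concepts():
    """Enumerate all (extent, intent) pairs by closing each object subset.

    For any object subset A, (A'', A') is a formal concept, so closing
    every subset enumerates the whole concept lattice (brute force).
    """
    concepts = set()
    objs = list(context)
    for r in range(len(objs) + 1):
        for subset in combinations(objs, r):
            intent = common_attributes(set(subset))   # A'
            extent = matching_objects(intent)          # A''
            concepts.add((frozenset(extent), frozenset(intent)))
    return concepts

# Print the lattice elements ordered by extent size (bottom to top).
for extent, intent in sorted(formal_concepts(), key=lambda c: len(c[0])):
    print(sorted(extent), "<->", sorted(intent))
```

On this context the enumeration yields seven concepts, from the bottom element (empty extent, all attributes) up to the top element (all objects, no shared attribute); the induced order is exactly extent inclusion.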

Tasks: Representation Learning

Grounded Language Learning Fast and Slow

1 code implementation • ICLR 2021 • Felix Hill, Olivier Tieleman, Tamara von Glehn, Nathaniel Wong, Hamza Merzic, Stephen Clark

Recent work has shown that large text-based neural language models, trained with conventional supervised learning objectives, acquire a surprising propensity for few- and one-shot learning.

Tasks: Grounded Language Learning, Meta-Learning, +1
