Search Results for author: Mathijs Mul

Found 5 papers, 1 paper with code

Learning to Request Guidance in Emergent Communication

no code implementations • 11 Dec 2019 • Benjamin Kolb, Leon Lang, Henning Bartsch, Arwin Gansekoele, Raymond Koopmanschap, Leonardo Romor, David Speck, Mathijs Mul, Elia Bruni

Previous research into agent communication has shown that a pre-trained guide can speed up the learning process of an imitation learning agent.

Imitation Learning

Learning to request guidance in emergent language

no code implementations • WS 2019 • Benjamin Kolb, Leon Lang, Henning Bartsch, Arwin Gansekoele, Raymond Koopmanschap, Leonardo Romor, David Speck, Mathijs Mul, Elia Bruni

Previous research into agent communication has shown that a pre-trained guide can speed up the learning process of an imitation learning agent.

Imitation Learning

Compositionality decomposed: how do neural networks generalise?

1 code implementation • 22 Aug 2019 • Dieuwke Hupkes, Verna Dankers, Mathijs Mul, Elia Bruni

Despite a multitude of empirical studies, little consensus exists on whether neural networks are able to generalise compositionally, a controversy that, in part, stems from a lack of agreement about what it means for a neural model to be compositional.
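As a purely illustrative aside (not taken from this paper's actual test suite), the toy Python split below shows the kind of held-out recombination that evaluations of compositional generalisation typically probe: every primitive occurs during training, but one exact combination of primitives is only seen at test time. The command vocabulary here is hypothetical.

```python
# Hypothetical sketch: probe compositional generalisation by holding out a
# novel *combination* of primitives that were each seen during training.
from itertools import product

verbs = ["jump", "walk", "look"]
modifiers = ["twice", "left", "around"]

all_commands = [f"{v} {m}" for v, m in product(verbs, modifiers)]

# Every primitive ("jump", "around") appears in training,
# but this exact pairing does not.
held_out = {"jump around"}
train_set = [c for c in all_commands if c not in held_out]
test_set = [c for c in all_commands if c in held_out]

print("train:", train_set)
print("test (novel recombination):", test_set)
```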

Mastering emergent language: learning to guide in simulated navigation

no code implementations • 14 Aug 2019 • Mathijs Mul, Diane Bouchacourt, Elia Bruni

A typical setup to achieve this is with a scripted teacher which guides a virtual agent using language instructions.

Navigate

Siamese recurrent networks learn first-order logic reasoning and exhibit zero-shot compositional generalization

no code implementations • 1 Jun 2019 • Mathijs Mul, Willem Zuidema

We approach this classic question with current methods, and demonstrate that recurrent neural networks can learn to recognize first order logical entailment relations between expressions.

Relation
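
The abstract describes Siamese recurrent networks that classify logical entailment relations between two expressions. Below is a minimal, hypothetical PyTorch sketch of that general setup; the GRU encoder, layer sizes, and number of relation labels are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal sketch (assumptions, not the paper's exact model): a Siamese setup
# encodes both expressions with the *same* recurrent encoder and classifies
# the logical relation from the pair of sentence embeddings.
import torch
import torch.nn as nn

class SiameseGRUClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128, num_relations=7):
        # num_relations is a placeholder for the size of the relation label set.
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)  # shared by both inputs
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_relations),
        )

    def encode(self, tokens):
        _, h = self.encoder(self.embed(tokens))  # final hidden state as sentence embedding
        return h.squeeze(0)

    def forward(self, left_tokens, right_tokens):
        left = self.encode(left_tokens)
        right = self.encode(right_tokens)
        return self.classifier(torch.cat([left, right], dim=-1))

# Toy usage with random token ids standing in for two tokenised expressions.
model = SiameseGRUClassifier(vocab_size=20)
left = torch.randint(0, 20, (1, 5))
right = torch.randint(0, 20, (1, 5))
logits = model(left, right)  # scores over candidate entailment relations
```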
