no code implementations • 11 Dec 2019 • Benjamin Kolb, Leon Lang, Henning Bartsch, Arwin Gansekoele, Raymond Koopmanschap, Leonardo Romor, David Speck, Mathijs Mul, Elia Bruni
Previous research into agent communication has shown that a pre-trained guide can speed up the learning process of an imitation learning agent.
1 code implementation • 22 Aug 2019 • Dieuwke Hupkes, Verna Dankers, Mathijs Mul, Elia Bruni
Despite a multitude of empirical studies, little consensus exists on whether neural networks are able to generalise compositionally, a controversy that, in part, stems from a lack of agreement about what it means for a neural model to be compositional.
no code implementations • 14 Aug 2019 • Mathijs Mul, Diane Bouchacourt, Elia Bruni
A typical setup to achieve this uses a scripted teacher that guides a virtual agent with language instructions.
no code implementations • 1 Jun 2019 • Mathijs Mul, Willem Zuidema
We approach this classic question with current methods, and demonstrate that recurrent neural networks can learn to recognize first order logical entailment relations between expressions.