Search Results for author: Jack Rae

Found 9 papers, 5 papers with code

Do Transformers Need Deep Long-Range Memory?

no code implementations • ACL 2020 • Jack Rae, Ali Razavi

Deep attention models have advanced the modelling of sequential data across many domains.

Deep Attention • Language Modelling

Multiplicative Interactions and Where to Find Them

no code implementations • ICLR 2020 • Siddhant M. Jayakumar, Wojciech M. Czarnecki, Jacob Menick, Jonathan Schwarz, Jack Rae, Simon Osindero, Yee Whye Teh, Tim Harley, Razvan Pascanu

We explore the role of multiplicative interaction as a unifying framework to describe a range of classical and modern neural network architectural motifs, such as gating, attention layers, hypernetworks, and dynamic convolutions amongst others.
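As a rough illustration of the unifying idea, the sketch below contrasts an additive combination with a multiplicative one: a sigmoid gate computed from a context vector rescales the transformed input element-wise, and a general bilinear form subsumes gating, attention-style re-weighting, and hypernetworks as special cases. All names, shapes, and values here are hypothetical, not taken from the paper's code.

```python
import numpy as np

def gating_unit(x, z, Wx, Wz):
    """A gating layer viewed as a multiplicative interaction:
    the context z modulates the transformed input element-wise,
    rather than being added to it. (Illustrative sketch.)"""
    gate = 1.0 / (1.0 + np.exp(-(Wz @ z)))  # sigmoid gate from context
    return (Wx @ x) * gate                   # multiplicative, not additive

def bilinear(x, z, W):
    """A general bilinear form y_k = x^T W_k z; gating, attention
    re-weighting, and hypernetworks arise as restricted cases."""
    # W has shape (out_dim, dim_x, dim_z)
    return np.einsum('i,kij,j->k', x, W, z)

rng = np.random.default_rng(0)
x, z = rng.normal(size=8), rng.normal(size=4)
Wx, Wz = rng.normal(size=(8, 8)), rng.normal(size=(8, 4))
print(gating_unit(x, z, Wx, Wz).shape)                    # (8,)
print(bilinear(x, z, rng.normal(size=(6, 8, 4))).shape)   # (6,)
```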

Training language GANs from Scratch

5 code implementations • NeurIPS 2019 • Cyprien de Masson d'Autume, Mihaela Rosca, Jack Rae, Shakir Mohamed

Generative Adversarial Networks (GANs) enjoy great success at image generation, but have proven difficult to train in the domain of natural language.

Image Generation • Text Generation
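One reason for that difficulty is that sampling discrete tokens is non-differentiable, so the discriminator's score cannot be backpropagated into the generator directly. The toy sketch below shows the standard REINFORCE workaround, which treats the discriminator's score as a reward; the paper builds on this idea with dense per-step rewards and variance reduction. The reward function and all names here are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_token(logits):
    # softmax over a toy vocabulary, then sample one discrete token
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return rng.choice(len(p), p=p), p

def reinforce_step(theta, discriminator_reward, lr=0.1):
    """Discrete sampling blocks gradients, so update the generator
    with REINFORCE: scale d log p(token)/d logits by the reward."""
    token, p = sample_token(theta)
    r = discriminator_reward(token)
    grad_logp = -p
    grad_logp[token] += 1.0      # one-hot(token) - p
    return theta + lr * r * grad_logp

theta = np.zeros(5)
reward = lambda tok: 1.0 if tok == 3 else 0.0  # stand-in for D's score
for _ in range(200):
    theta = reinforce_step(theta, reward)
print(np.argmax(theta))  # the generator learns to emit the rewarded token
```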

Neural Arithmetic Logic Units

21 code implementations • NeurIPS 2018 • Andrew Trask, Felix Hill, Scott Reed, Jack Rae, Chris Dyer, Phil Blunsom

Neural networks can learn to represent and manipulate numerical information, but they seldom generalize well outside of the range of numerical values encountered during training.
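The proposed Neural Arithmetic Logic Unit addresses this by constraining weights toward {-1, 0, 1}, so the cell learns exact addition and subtraction that extrapolate, and by gating between that additive path and the same operation applied in log space, which turns sums into products. Below is a minimal NumPy sketch following the cell equations described in the paper; the random parameters and shapes are illustrative, not trained values.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def nac(x, W_hat, M_hat):
    """Neural Accumulator: tanh(W_hat) * sigmoid(M_hat) pushes each
    weight toward -1, 0, or 1, biasing the cell toward exact
    addition/subtraction that extrapolates beyond the training range."""
    W = np.tanh(W_hat) * sigmoid(M_hat)
    return W @ x

def nalu(x, W_hat, M_hat, G, eps=1e-7):
    """NALU: a learned gate g interpolates between an additive NAC
    and the same NAC in log space (sums of logs become products)."""
    a = nac(x, W_hat, M_hat)                                 # add/subtract path
    m = np.exp(nac(np.log(np.abs(x) + eps), W_hat, M_hat))   # multiply/divide path
    g = sigmoid(G @ x)                                       # gate between paths
    return g * a + (1 - g) * m

rng = np.random.default_rng(1)
x = rng.uniform(1, 10, size=4)
W_hat, M_hat, G = (rng.normal(size=(2, 4)) for _ in range(3))
print(nalu(x, W_hat, M_hat, G))
```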

Model-Free Episodic Control

3 code implementations • 14 Jun 2016 • Charles Blundell, Benigno Uria, Alexander Pritzel, Yazhe Li, Avraham Ruderman, Joel Z. Leibo, Jack Rae, Daan Wierstra, Demis Hassabis

State of the art deep reinforcement learning algorithms take many millions of interactions to attain human-level performance.

Decision Making • Hippocampus +1
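The paper's episodic control sidesteps slow gradient-based value learning with a non-parametric memory: the best return ever observed from each state-action pair is written directly into a table, and unseen states fall back to k-nearest-neighbour estimates over state embeddings. The sketch below is a simplified illustration of that idea, not the paper's implementation (which embeds states with random projections or a VAE and bounds the memory's size).

```python
import numpy as np

class EpisodicController:
    """Simplified episodic-control memory: one (keys, returns) table
    per action; exact hits return the stored value, otherwise the
    k nearest stored states are averaged. (Illustrative sketch.)"""

    def __init__(self, n_actions, k=5):
        self.memory = [([], []) for _ in range(n_actions)]
        self.k = k

    def estimate(self, s, a):
        keys, rets = self.memory[a]
        if not keys:
            return 0.0
        d = np.linalg.norm(np.asarray(keys) - s, axis=1)
        idx = np.argsort(d)[: self.k]
        if d[idx[0]] == 0.0:                # exact match: stored best return
            return rets[idx[0]]
        return float(np.mean([rets[i] for i in idx]))  # k-NN fallback

    def act(self, s):
        # act greedily with respect to the episodic value estimates
        return int(np.argmax([self.estimate(s, a) for a in range(len(self.memory))]))

    def update(self, s, a, ret):
        """At episode end, keep the highest return seen from (s, a)."""
        keys, rets = self.memory[a]
        for i, key in enumerate(keys):
            if np.array_equal(key, s):
                rets[i] = max(rets[i], ret)
                return
        keys.append(np.asarray(s, dtype=float))
        rets.append(ret)

ec = EpisodicController(n_actions=2)
ec.update(np.array([0.0, 1.0]), a=1, ret=5.0)
print(ec.act(np.array([0.0, 1.0])))  # -> 1
```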
