Search Results for author: Róbert Csordás

Found 3 papers, 3 papers with code

Going Beyond Linear Transformers with Recurrent Fast Weight Programmers

3 code implementations · 11 Jun 2021 · Kazuki Irie, Imanol Schlag, Róbert Csordás, Jürgen Schmidhuber

Transformers with linearised attention ("linear Transformers") have demonstrated the practical scalability and effectiveness of outer product-based Fast Weight Programmers (FWPs) from the '90s.
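The outer-product fast-weight view mentioned in the abstract can be sketched in a few lines. This is an illustrative sketch, not the authors' code: the "fast weight" matrix is written with an outer product of a value and a feature-mapped key, then read out with the feature-mapped query (the `elu + 1` feature map is an assumption, one common choice for linear attention).

```python
import numpy as np

def phi(x):
    # Assumed positive feature map: elu(x) + 1 (a common linear-attention choice).
    return np.where(x > 0, x + 1.0, np.exp(x))

def fwp_step(W, k, v, q):
    """One sequence step of an outer-product Fast Weight Programmer:
    write v ⊗ phi(k) into the fast weights W, then read with phi(q)."""
    W = W + np.outer(v, phi(k))  # programme the fast weights
    y = W @ phi(q)               # query the fast weights
    return W, y

# Tiny usage example with random keys/values/queries.
d = 4
rng = np.random.default_rng(0)
W = np.zeros((d, d))
for _ in range(3):
    k, v, q = rng.normal(size=(3, d))
    W, y = fwp_step(W, k, v, q)
print(y.shape)  # (4,)
```

Querying with the same key that was just written retrieves the stored value scaled by `phi(k)·phi(k)`, which is the associative-memory reading of linear attention.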

Atari Games

Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks

1 code implementation · ICLR 2021 · Róbert Csordás, Sjoerd van Steenkiste, Jürgen Schmidhuber

Neural networks (NNs) whose subnetworks implement reusable functions are expected to offer numerous advantages, including compositionality through efficient recombination of functional building blocks, interpretability, preventing catastrophic interference, etc.
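The differentiable weight masks named in the title can be illustrated with a minimal sketch (names and details here are ours, not the paper's code): each weight gets a learnable logit, a sigmoid turns the logits into a soft mask in (0, 1), and the network runs with masked weights `W * m`. Training the logits while the weights stay frozen reveals which weights a given function actually relies on.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def masked_forward(x, W, mask_logits, tau=1.0):
    """Forward pass of one linear layer through a differentiable weight mask.
    tau is an assumed temperature: smaller tau pushes the mask toward 0/1."""
    m = sigmoid(mask_logits / tau)  # soft, differentiable per-weight mask
    return x @ (W * m)

# Usage: start with logits at 0, i.e. every weight half-on (m ≈ 0.5).
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
logits = np.zeros_like(W)
x = rng.normal(size=(5, 3))
y = masked_forward(x, W, logits)
print(y.shape)  # (5, 2)
```

Driving a logit strongly positive keeps the weight; driving it strongly negative prunes it, so the learned mask doubles as a subnetwork detector.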

Systematic Generalization
