no code implementations • NeurIPS 2021 • Kyle Aitken, Vinay V. Ramasesh, Yuan Cao, Niru Maheswaranathan
Moreover, how these mechanisms vary depending on the particular architecture used for the encoder and decoder (recurrent, feed-forward, etc.) is not well understood.
1 code implementation • ICLR 2021 • Kyle Aitken, Vinay V. Ramasesh, Ankush Garg, Yuan Cao, David Sussillo, Niru Maheswaranathan
Using tools from dynamical systems analysis, we study recurrent networks trained on a battery of both natural and synthetic text classification tasks.
no code implementations • 11 Jun 2020 • Kyle Aitken, Guy Gur-Ari
We consider an existing conjecture addressing the asymptotic behavior of neural networks in the large width limit.