no code implementations • 1 Jul 2022 • Jack Lindsey, Ashok Litwin-Kumar
The model provides a computational account of numerous experimental findings about dopamine activity that classic models of reinforcement learning in the basal ganglia cannot explain.
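For reference, the classic account being contrasted treats dopamine activity as a temporal-difference reward prediction error. The following is a minimal TD(0) sketch of that classic model, not the model proposed in this paper; all variable names and the toy task are illustrative.

    import numpy as np

    # Classic basal-ganglia RL account: dopamine reports the temporal-difference
    # reward prediction error  delta_t = r_t + gamma * V(s_{t+1}) - V(s_t).
    n_states, gamma, alpha = 5, 0.9, 0.1
    V = np.zeros(n_states)                    # learned state values

    def td_step(s, r, s_next):
        """One TD(0) update; the returned delta is the classic 'dopamine' signal."""
        delta = r + gamma * V[s_next] - V[s]  # reward prediction error
        V[s] += alpha * delta
        return delta

    # Toy example: a chain of states with reward delivered on the final transition.
    for _ in range(200):
        for s in range(n_states - 1):
            td_step(s, 1.0 if s == n_states - 2 else 0.0, s + 1)
    print(np.round(V, 2))                     # values propagate back from the reward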
no code implementations • 2 May 2022 • Jack Lindsey, James B Aimone
In this work we develop a model of predictive learning on neuromorphic hardware.
no code implementations • NeurIPS 2020 • Jack Lindsey, Ashok Litwin-Kumar
Interest in biologically inspired alternatives to backpropagation is driven by the desire to both advance connections between deep learning and neuroscience and address backpropagation's shortcomings on tasks such as online, continual learning.
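One well-known example of such an alternative (feedback alignment; named here for illustration, not the mechanism studied in this paper) replaces the transposed forward weights in the backward pass with fixed random feedback weights. A minimal sketch for a two-layer network, with a toy usage example:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out, lr = 10, 32, 2, 0.05
    W1 = rng.normal(0.0, 0.1, (n_hid, n_in))   # forward weights, layer 1
    W2 = rng.normal(0.0, 0.1, (n_out, n_hid))  # forward weights, layer 2
    B  = rng.normal(0.0, 0.1, (n_hid, n_out))  # fixed random feedback weights (used instead of W2.T)

    def train_step(x, y):
        global W1, W2
        h = np.maximum(0.0, W1 @ x)            # hidden layer with ReLU
        e = W2 @ h - y                         # output error of the linear readout
        dh = (B @ e) * (h > 0)                 # error routed through fixed B rather than W2.T
        W2 -= lr * np.outer(e, h)
        W1 -= lr * np.outer(dh, x)
        return 0.5 * float(e @ e)

    # Toy usage: the loss on a fixed input-target pair decreases even though
    # the backward pathway never mirrors the forward weights.
    x, y = rng.normal(size=n_in), rng.normal(size=n_out)
    losses = [train_step(x, y) for _ in range(100)]
    print(losses[0], losses[-1])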
no code implementations • ICLR 2019 • Jack Lindsey, Samuel A. Ocko, Surya Ganguli, Stephane Deny
Neural representations vary drastically across the first stages of visual processing.
1 code implementation • 3 Jan 2019 • Jack Lindsey, Samuel A. Ocko, Surya Ganguli, Stephane Deny
The visual system is hierarchically organized to process visual information in successive stages.
no code implementations • NeurIPS 2018 • Samuel Ocko, Jack Lindsey, Surya Ganguli, Stephane Deny
We also train a nonlinear encoding model with a rectifying nonlinearity to efficiently encode naturalistic movies, and again find emergent receptive fields resembling those of midget and parasol cells, now further subdivided into ON and OFF types.
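To make the setup concrete, below is a minimal sketch of a rectified (ReLU) encoding model trained to reconstruct movie patches under a firing-rate penalty, as a stand-in for the efficient-coding objective; the architecture, loss weights, and random data are placeholders and not the paper's exact model.

    import torch
    import torch.nn as nn

    class RectifiedEncoder(nn.Module):
        """Linear-nonlinear encoder with a rectifying nonlinearity plus a linear decoder."""
        def __init__(self, n_pixels=256, n_cells=64):
            super().__init__()
            self.encode = nn.Linear(n_pixels, n_cells)  # rows of this weight matrix act as receptive fields
            self.decode = nn.Linear(n_cells, n_pixels)

        def forward(self, x):
            rates = torch.relu(self.encode(x))          # rectification -> nonnegative "firing rates"
            return self.decode(rates), rates

    model = RectifiedEncoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    frames = torch.randn(512, 256)                      # stand-in for whitened naturalistic movie patches

    for _ in range(200):
        recon, rates = model(frames)
        loss = ((recon - frames) ** 2).mean() + 1e-3 * rates.abs().mean()  # reconstruction + firing cost
        opt.zero_grad()
        loss.backward()
        opt.step()

    # model.encode.weight can then be inspected as learned receptive fields; in the paper,
    # the learned fields separate into midget/parasol-like and ON/OFF-like types.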
no code implementations • 15 Dec 2017 • Jack Lindsey
Recurrent neural networks with differentiable attention mechanisms have had success in generative and classification tasks.