Search Results for author: Kanaka Rajan

Found 6 papers, 1 paper with code

full-FORCE: A Target-Based Method for Training Recurrent Networks

1 code implementation • 9 Oct 2017 • Brian DePasquale, Christopher J. Cueva, Kanaka Rajan, G. Sean Escola, L. F. Abbott

We present a target-based method for modifying the full connectivity matrix of a recurrent network to train it to perform tasks involving temporally complex input/output transformations.
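The target-based idea can be sketched in a batch-simplified form: drive a "teacher" network with the desired output so its activity defines targets, then solve for the learner's full recurrent connectivity by least squares. This is an illustrative simplification, not the paper's reference implementation (full-FORCE trains online with recursive least squares); all parameter choices and variable names below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, dt = 300, 0.01
t = np.arange(0, 10, dt)
f_out = np.sin(2 * np.pi * 0.2 * t)  # desired output of the trained network

# Teacher network: random recurrent weights J_D plus the target signal fed
# in through input weights u, so its activity embeds the target dynamics
g = 1.5
J_D = g * rng.standard_normal((N, N)) / np.sqrt(N)
u = rng.standard_normal(N)

x = 0.1 * rng.standard_normal(N)
R = np.empty((len(t), N))  # firing rates of the teacher over time
for i in range(len(t)):
    r = np.tanh(x)
    R[i] = r
    x = x + dt * (-x + J_D @ r + u * f_out[i])

# Target recurrent drive for the learner at each time step:
# z(t) = J_D r(t) + u f(t)
Z = R @ J_D.T + np.outer(f_out, u)

# Solve the learner's full connectivity matrix J so that J r(t) ≈ z(t)
C, *_ = np.linalg.lstsq(R, Z, rcond=None)
J = C.T

# Linear readout recovering the target output from the rates
w_out, *_ = np.linalg.lstsq(R, f_out, rcond=None)
```

With `J` in place, the learner network would be run autonomously (no target input) and the readout `w_out` applied to its rates to produce the output.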

A brain basis of dynamical intelligence for AI and computational neuroscience

no code implementations • 15 May 2021 • Joseph D. Monaco, Kanaka Rajan, Grace M. Hwang

To motivate a brain basis of neural computation, we present a dynamical view of intelligence from which we elaborate concepts of sparsity in network structure, temporal dynamics, and interactive learning.

Efficient and robust multi-task learning in the brain with modular latent primitives

no code implementations • 28 May 2021 • Christian David Márton, Léo Gagnon, Guillaume Lajoie, Kanaka Rajan

For this reason, a central aspect of human learning is the ability to recycle previously acquired knowledge in a way that allows for faster, less resource-intensive acquisition of new skills.

Multi-Task Learning

Curriculum learning as a tool to uncover learning principles in the brain

no code implementations • ICLR 2022 • Daniel R. Kepple, Rainer Engelken, Kanaka Rajan

Using recurrent neural networks (RNNs) and models of common experimental neuroscience tasks, we demonstrate that curricula can be used to differentiate learning principles, using target-based and representation-based loss functions as use cases.

TRAKR – A reservoir-based tool for fast and accurate classification of neural time-series patterns

no code implementations • 29 Sep 2021 • Muhammad Furqan Afzal, Christian D Marton, Erin L. Rich, Kanaka Rajan

Therefore, TRAKR can be used as a fast and accurate tool to distinguish patterns in complex nonlinear time-series data, such as neural recordings.

Time Series • Time Series Analysis
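The reservoir-based classification idea can be sketched as follows: drive a fixed random reservoir with each reference time series, train a linear readout per series, and classify a new pattern by which readout reconstructs it with the smallest error. This is a minimal echo-state-network sketch of the general approach, with illustrative signals and parameters; it is not the TRAKR implementation.

```python
import numpy as np

def reservoir_states(signal, N=100, g=1.2, seed=0):
    """Drive a fixed random reservoir with a 1-D signal; return unit states."""
    rng = np.random.default_rng(seed)
    W = g * rng.standard_normal((N, N)) / np.sqrt(N)  # fixed recurrent weights
    w_in = rng.standard_normal(N)                      # fixed input weights
    x = np.zeros(N)
    states = np.empty((len(signal), N))
    for i, s in enumerate(signal):
        x = np.tanh(W @ x + w_in * s)
        states[i] = x
    return states

def fit_readout(signal):
    """Train a linear readout to predict the next sample from reservoir state."""
    X = reservoir_states(signal[:-1])
    w, *_ = np.linalg.lstsq(X, signal[1:], rcond=None)
    return w

def reconstruction_error(signal, w):
    """Mean squared one-step prediction error of a readout on a signal."""
    X = reservoir_states(signal[:-1])
    return np.mean((X @ w - signal[1:]) ** 2)

# Two reference patterns, one readout each
t = np.linspace(0, 4 * np.pi, 400)
sine, chirp = np.sin(t), np.sin(t * t / 10)
w_sine, w_chirp = fit_readout(sine), fit_readout(chirp)

# Classify a phase-shifted sine by comparing reconstruction errors
test = np.sin(t + 0.3)
err_sine = reconstruction_error(test, w_sine)
err_chirp = reconstruction_error(test, w_chirp)
```

The reconstruction error acts as a distance between the test pattern and each reference pattern, so the smallest-error readout labels the test series.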
