Search Results for author: Matthias H. Hennig

Found 5 papers, 2 papers with code

Targeted Neural Dynamical Modeling

2 code implementations NeurIPS 2021 Cole Hurwitz, Akash Srivastava, Kai Xu, Justin Jude, Matthew G. Perich, Lee E. Miller, Matthias H. Hennig

These approaches, however, are limited in their ability to capture the underlying neural dynamics (e.g. linear) and in their ability to relate the learned dynamics back to the observed behaviour (e.g. no time lag).
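
The paper's own code is linked above; what follows is only a minimal, hypothetical sketch of the idea this excerpt hints at: a nonlinear latent model whose latent state is split into a behaviourally relevant part, read out with a time lag, and an irrelevant part. All names, dimensions, and the fixed-lag readout are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch, NOT the TNDM implementation from the paper.
import torch
import torch.nn as nn

class LaggedLatentModel(nn.Module):
    def __init__(self, n_neurons, n_rel=4, n_irr=4, lag_steps=3):
        super().__init__()
        self.rnn = nn.GRU(n_neurons, n_rel + n_irr, batch_first=True)
        self.n_rel = n_rel
        self.lag_steps = lag_steps                    # assumed fixed lag
        self.neural_readout = nn.Linear(n_rel + n_irr, n_neurons)
        self.behaviour_readout = nn.Linear(n_rel, 2)  # e.g. 2D hand velocity

    def forward(self, spikes):
        latents, _ = self.rnn(spikes)                 # (batch, time, latent)
        rates = self.neural_readout(latents)          # reconstruct neural activity
        # behaviour at time t is decoded from the relevant latents at t - lag
        rel = latents[:, : -self.lag_steps, : self.n_rel]
        behaviour = self.behaviour_readout(rel)
        return rates, behaviour

model = LaggedLatentModel(n_neurons=50)
spikes = torch.randn(8, 100, 50)                      # toy (batch, time, neurons) data
rates, behaviour = model(spikes)
print(rates.shape, behaviour.shape)                   # (8, 100, 50) and (8, 97, 2)
```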

Hippocampal representations emerge when training recurrent neural networks on a memory dependent maze navigation task

no code implementations 2 Dec 2020 Justin Jude, Matthias H. Hennig

We observe that once a recurrent network is trained to learn the structure of its environment solely based on sensory prediction, an attractor based landscape forms in the network's representation, which parallels hippocampal place cells in structure and function.
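
As a rough illustration of the training signal described here (a recurrent network trained purely on next-step sensory prediction), the sketch below is a hedged stand-in, not the authors' code; the observation and action sizes, the GRU architecture, and the toy trajectories are all assumptions.

```python
# Hedged sketch: train a recurrent network on next-step sensory prediction,
# then inspect its hidden states for attractor / place-cell-like structure.
import torch
import torch.nn as nn

obs_dim, act_dim, hidden = 16, 4, 128
rnn = nn.GRU(obs_dim + act_dim, hidden, batch_first=True)
predictor = nn.Linear(hidden, obs_dim)
opt = torch.optim.Adam(list(rnn.parameters()) + list(predictor.parameters()), lr=1e-3)

# toy trajectories standing in for maze roll-outs: (batch, time, features)
obs = torch.randn(32, 50, obs_dim)
acts = torch.randn(32, 50, act_dim)

for step in range(100):
    inputs = torch.cat([obs[:, :-1], acts[:, :-1]], dim=-1)
    hidden_states, _ = rnn(inputs)
    pred_next_obs = predictor(hidden_states)
    loss = nn.functional.mse_loss(pred_next_obs, obs[:, 1:])  # sensory prediction loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# hidden_states (batch, time, hidden) are what one would analyse after training
# for the attractor-based, place-cell-like representation described above.
```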

Hippocampus, Q-Learning

Building population models for large-scale neural recordings: opportunities and pitfalls

no code implementations 3 Feb 2021 Cole Hurwitz, Nina Kudryashova, Arno Onken, Matthias H. Hennig

Modern recording technologies now enable simultaneous recording from large numbers of neurons.

Capturing cross-session neural population variability through self-supervised identification of consistent neuron ensembles

no code implementations 19 May 2022 Justin Jude, Matthew G. Perich, Lee E. Miller, Matthias H. Hennig

Classifying neurons as consistent or unfamiliar across sessions, and accounting for changes in the order of the consistent neurons between recording sessions, may then maintain decoding performance.
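
The paper's approach is self-supervised; purely to illustrate the bookkeeping problem this sentence describes (reordering neurons across sessions so a fixed decoder can be reused), here is a generic correlation-plus-Hungarian-assignment sketch. This matching scheme is an assumption for illustration, not the authors' method.

```python
# Hedged illustration: match neuron order between two sessions so that a
# decoder trained on session A can be applied to session B.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
templates_a = rng.normal(size=(40, 200))            # session A: 40 neurons x time
perm = rng.permutation(40)
templates_b = templates_a[perm] + 0.1 * rng.normal(size=(40, 200))  # shuffled, noisy session B

# correlation between every session-A / session-B neuron pair
corr = np.corrcoef(templates_a, templates_b)[:40, 40:]
row, col = linear_sum_assignment(-corr)             # maximise total correlation

inv_perm = np.argsort(perm)                         # true session-B index of each A neuron
print("matching accuracy:", np.mean(col == inv_perm))
```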
