Search Results for author: David Lipshutz

Found 16 papers, 7 papers with code

Modeling Neural Activity with Conditionally Linear Dynamical Systems

1 code implementation · 25 Feb 2025 · Victor Geadah, Amin Nejatbakhsh, David Lipshutz, Jonathan W. Pillow, Alex H. Williams

Neural population activity exhibits complex, nonlinear dynamics, varying in time, over trials, and across experimental conditions.

Tasks: Bayesian Inference

Comparing noisy neural population dynamics using optimal transport distances

no code implementations · 19 Dec 2024 · Amin Nejatbakhsh, Victor Geadah, Alex H. Williams, David Lipshutz

Biological and artificial neural systems form high-dimensional neural representations that underpin their computational capabilities.

Tasks: Gaussian Processes, Image Generation

What Representational Similarity Measures Imply about Decodable Information

no code implementations · 12 Nov 2024 · Sarah E. Harvey, David Lipshutz, Alex H. Williams

Neural responses encode information that is useful for a variety of downstream tasks.

Discriminating image representations with principal distortions

no code implementations · 20 Oct 2024 · Jenelle Feather, David Lipshutz, Sarah E. Harvey, Alex H. Williams, Eero P. Simoncelli

This metric may then be used to optimally differentiate a set of models by finding a pair of "principal distortions" that maximize the variance of the models under it.

Shaping the distribution of neural responses with interneurons in a recurrent circuit model

1 code implementation · 28 May 2024 · David Lipshutz, Eero P. Simoncelli

The circuit, which comprises primary neurons recurrently connected to a set of local interneurons, continuously optimizes this objective by dynamically adjusting both the synaptic connections between neurons and the interneuron activation functions.

Neuronal Temporal Filters as Normal Mode Extractors

no code implementations · 6 Jan 2024 · Siavash Golkar, Jules Berman, David Lipshutz, Robert Mihai Haret, Tim Gollisch, Dmitri B. Chklovskii

Such variation in the temporal filter with input SNR resembles that observed experimentally in biological neurons.

Tasks: Time Series

Adaptive whitening with fast gain modulation and slow synaptic plasticity

1 code implementation · NeurIPS 2023 · Lyndon R. Duong, Eero P. Simoncelli, Dmitri B. Chklovskii, David Lipshutz

Neurons in early sensory areas rapidly adapt to changing sensory statistics, both by normalizing the variance of their individual responses and by reducing correlations between their responses.
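The variance-normalization half of this adaptation can be illustrated with a toy multiplicative gain rule (a hypothetical batch sketch for intuition only, not the paper's fast/slow circuit): each unit scales its gain until its output variance reaches one.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy inputs with very different per-channel variances.
X = rng.standard_normal((5000, 3)) * np.array([0.5, 2.0, 4.0])

# Iteratively nudge each multiplicative gain toward 1/std of its channel,
# so every gain-modulated output approaches unit variance.
g = np.ones(3)
for _ in range(50):
    var = np.var(X * g, axis=0)
    g *= var ** -0.25          # damped update; fixed point is var = 1

print(np.var(X * g, axis=0))   # all three variances are close to 1
```

Gains alone only equalize marginal variances; removing correlations between responses additionally requires the whitening transformations discussed in the papers above.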

Normative framework for deriving neural networks with multi-compartmental neurons and non-Hebbian plasticity

no code implementations · 20 Feb 2023 · David Lipshutz, Yanis Bahroun, Siavash Golkar, Anirvan M. Sengupta, Dmitri B. Chklovskii

These NN models account for many anatomical and physiological observations; however, the objectives have limited computational power and the derived NNs do not explain multi-compartmental neuronal structures and non-Hebbian forms of plasticity that are prevalent throughout the brain.

Tasks: Self-Supervised Learning

Adaptive whitening in neural populations with gain-modulating interneurons

1 code implementation · 27 Jan 2023 · Lyndon R. Duong, David Lipshutz, David J. Heeger, Dmitri B. Chklovskii, Eero P. Simoncelli

Statistical whitening transformations play a fundamental role in many computational systems, and may also play an important role in biological sensory systems.
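As a concrete reference point, a statistical whitening transform can be computed offline in a few lines (a minimal ZCA sketch with made-up data; the papers above derive online, biologically plausible circuits rather than this batch computation):

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated toy data (illustrative only).
X = rng.standard_normal((2000, 2)) @ np.array([[2.0, 1.0], [0.0, 1.0]])
X -= X.mean(axis=0)

# ZCA whitening: multiply by the inverse matrix square root of the covariance.
C = X.T @ X / len(X)
evals, evecs = np.linalg.eigh(C)
W = evecs @ np.diag(evals ** -0.5) @ evecs.T
Y = X @ W

# The whitened covariance is the identity: unit variances, zero correlations.
print(Y.T @ Y / len(Y))
```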

An online algorithm for contrastive Principal Component Analysis

no code implementations · 14 Nov 2022 · Siavash Golkar, David Lipshutz, Tiberiu Tesileanu, Dmitri B. Chklovskii

However, the performance of cPCA is sensitive to hyper-parameter choice and there is currently no online algorithm for implementing cPCA.
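The batch problem that an online cPCA algorithm targets is itself simple: eigendecompose the difference between the target and background covariances, with a hyper-parameter alpha weighting the background (an illustrative sketch with made-up data; the variable names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
# Made-up data: the target set has extra variance along the first axis,
# while both sets share strong variance along the second axis.
background = rng.standard_normal((500, 2)) * np.array([1.0, 3.0])
target = rng.standard_normal((500, 2)) * np.array([2.0, 3.0])

def cpca_direction(target, background, alpha=1.0):
    """Leading eigenvector of C_target - alpha * C_background."""
    C_t = np.cov(target, rowvar=False)
    C_b = np.cov(background, rowvar=False)
    evals, evecs = np.linalg.eigh(C_t - alpha * C_b)
    return evecs[:, -1]          # direction of largest contrastive variance

v = cpca_direction(target, background, alpha=1.0)
print(np.abs(v))                 # dominated by the first, target-enriched axis
```

The sensitivity to alpha is visible here: alpha = 0 reduces to ordinary PCA on the target set, while large alpha penalizes any direction with background variance.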

Tasks: Contrastive Learning

Interneurons accelerate learning dynamics in recurrent neural networks for statistical adaptation

no code implementations · 21 Sep 2022 · David Lipshutz, Cengiz Pehlevan, Dmitri B. Chklovskii

To this end, we consider two mathematically tractable recurrent linear neural networks that statistically whiten their inputs -- one with direct recurrent connections and the other with interneurons that mediate recurrent communication.

A biologically plausible neural network for local supervision in cortical microcircuits

no code implementations · 30 Nov 2020 · Siavash Golkar, David Lipshutz, Yanis Bahroun, Anirvan M. Sengupta, Dmitri B. Chklovskii

The backpropagation algorithm is an invaluable tool for training artificial neural networks; however, because of a weight sharing requirement, it does not provide a plausible model of brain function.

A simple normative network approximates local non-Hebbian learning in the cortex

no code implementations · NeurIPS 2020 · Siavash Golkar, David Lipshutz, Yanis Bahroun, Anirvan M. Sengupta, Dmitri B. Chklovskii

Here, adopting a normative approach, we model these instructive signals as supervisory inputs guiding the projection of the feedforward data.

Biologically plausible single-layer networks for nonnegative independent component analysis

1 code implementation · 23 Oct 2020 · David Lipshutz, Cengiz Pehlevan, Dmitri B. Chklovskii

To model how the brain performs this task, we seek a biologically plausible single-layer neural network implementation of a blind source separation algorithm.

Tasks: Blind Source Separation

A biologically plausible neural network for Slow Feature Analysis

1 code implementation · NeurIPS 2020 · David Lipshutz, Charlie Windolf, Siavash Golkar, Dmitri B. Chklovskii

Furthermore, when trained on naturalistic stimuli, SFA reproduces interesting properties of cells in the primary visual cortex and hippocampus, suggesting that the brain uses temporal slowness as a computational principle for learning latent features.
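In its linear form, SFA reduces to a generalized eigenproblem: whiten the inputs, then take the direction whose temporal derivative has the smallest variance. A toy batch demonstration (hypothetical data; the paper's contribution is a plausible online circuit, not this offline solution):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy mixture of one slow sinusoid and one fast noise source.
t = np.linspace(0, 8 * np.pi, 2000)
slow, fast = np.sin(t), rng.standard_normal(len(t))
X = np.column_stack([slow + 0.1 * fast, fast + 0.1 * slow])
X -= X.mean(axis=0)

# Whiten the inputs, then pick the whitened direction whose temporal
# derivative has the least variance -- the "slowest" linear feature.
C = X.T @ X / len(X)
e, E = np.linalg.eigh(C)
Wh = E @ np.diag(e ** -0.5) @ E.T
Z = X @ Wh
dZ = np.diff(Z, axis=0)
ea, Ea = np.linalg.eigh(dZ.T @ dZ / len(dZ))
y = Z @ Ea[:, 0]                 # slowest unit-variance feature

# It recovers the slow source (up to sign).
print(abs(np.corrcoef(y, slow)[0, 1]))
```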

Tasks: Hippocampus, Time Series (+1)

A biologically plausible neural network for multi-channel Canonical Correlation Analysis

1 code implementation · 1 Oct 2020 · David Lipshutz, Yanis Bahroun, Siavash Golkar, Anirvan M. Sengupta, Dmitri B. Chklovskii

For biological plausibility, we require that the network operates in the online setting and its synaptic update rules are local.
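For reference, the batch CCA problem that such an online network solves can be written as an SVD of the whitened cross-covariance between the two views (a toy sketch with made-up paired data, not the paper's network):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Two toy "views" sharing a single latent signal z plus private noise.
z = rng.standard_normal(n)
X = np.column_stack([z + 0.1 * rng.standard_normal(n), rng.standard_normal(n)])
Y = np.column_stack([rng.standard_normal(n), z + 0.1 * rng.standard_normal(n)])
X -= X.mean(axis=0)
Y -= Y.mean(axis=0)

def whiten(M):
    """Return M transformed to have identity covariance."""
    C = M.T @ M / len(M)
    e, E = np.linalg.eigh(C)
    return M @ (E @ np.diag(e ** -0.5) @ E.T)

# Canonical correlations are the singular values of the whitened
# cross-covariance; the leading one reflects the shared latent z.
s = np.linalg.svd(whiten(X).T @ whiten(Y) / n, compute_uv=False)
print(s[0])   # close to 1: the two views share a strong common component
```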
