no code implementations • 21 Oct 2024 • Finn Schmidt, Polina Turishcheva, Suhas Shrinivasan, Fabian H. Sinz
We address this gap by proposing a probabilistic model that predicts the joint distribution of the neuronal responses from video stimuli and stimulus-independent latent factors.
2 code implementations • 12 Jul 2024 • Polina Turishcheva, Paul G. Fahey, Michaela Vystrčilová, Laura Hansel, Rachel Froebe, Kayla Ponder, Yongrong Qiu, Konstantin F. Willeke, Mohammad Bashiri, Ruslan Baikulov, Yu Zhu, Lei Ma, Shan Yu, Tiejun Huang, Bryan M. Li, Wolf De Wulf, Nina Kudryashova, Matthias H. Hennig, Nathalie L. Rochefort, Arno Onken, Eric Wang, Zhiwei Ding, Andreas S. Tolias, Fabian H. Sinz, Alexander S Ecker
To address this gap, we established the Sensorium 2023 Benchmark Competition with dynamic input, featuring a new large-scale dataset from the primary visual cortex of ten mice.
no code implementations • 18 Jun 2024 • Polina Turishcheva, Max Burg, Fabian H. Sinz, Alexander Ecker
Such weight vectors, which can be thought of as embeddings of neuronal function, have been proposed to define functional cell types via unsupervised clustering.
1 code implementation • 10 Mar 2024 • Paweł A. Pierzchlewicz, Caio O. da Silva, R. James Cotton, Fabian H. Sinz
Current research has predominantly concentrated on generating multiple hypotheses for single-frame static pose estimation or single-hypothesis motion estimation.
3 code implementations • 31 May 2023 • Polina Turishcheva, Paul G. Fahey, Laura Hansel, Rachel Froebe, Kayla Ponder, Michaela Vystrčilová, Konstantin F. Willeke, Mohammad Bashiri, Eric Wang, Zhiwei Ding, Andreas S. Tolias, Fabian H. Sinz, Alexander S. Ecker
We hope this competition will continue to strengthen the accompanying Sensorium benchmarks collection as a standard tool to measure progress in large-scale neural system identification models of the entire mouse visual hierarchy and beyond.
no code implementations • 24 May 2023 • Arne F. Nix, Max F. Burg, Fabian H. Sinz
To improve these aspects of KD, we propose Hard Augmentations for Robust Distillation (HARD), a generally applicable data augmentation framework that generates synthetic data points for which the teacher and the student disagree.
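The core idea can be sketched as an input optimization that maximizes teacher-student disagreement. The snippet below is a minimal illustration in PyTorch, assuming a KL-divergence disagreement measure and plain gradient ascent on the input; function names and hyperparameters are illustrative, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def generate_disagreement_sample(teacher, student, x_init, steps=50, lr=0.1):
    """Optimize an input so that teacher and student predictions diverge.

    A hedged sketch of disagreement-based augmentation: only the input is
    updated; the disagreement measure here is the KL divergence between the
    two softmax outputs.
    """
    x = x_init.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)  # optimizes the input only
    teacher.eval(); student.eval()
    for _ in range(steps):
        opt.zero_grad()
        with torch.no_grad():
            t_logits = teacher(x)            # teacher side is kept fixed
        s_logits = student(x)
        # Negative KL: gradient ascent on teacher-student disagreement.
        loss = -F.kl_div(F.log_softmax(s_logits, dim=-1),
                         F.softmax(t_logits, dim=-1),
                         reduction="batchmean")
        loss.backward()
        opt.step()
    return x.detach()
```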
1 code implementation • 20 Oct 2022 • Paweł A. Pierzchlewicz, R. James Cotton, Mohammad Bashiri, Fabian H. Sinz
We evaluate cGNF on the Human3.6M dataset and show that cGNF provides a well-calibrated distribution estimate while being close to the state of the art in terms of overall minMPJPE.
Density Estimation
Multi-Hypotheses 3D Human Pose Estimation
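For context, minMPJPE is the standard multi-hypothesis pose metric: the mean per-joint position error of the best of K hypotheses. A minimal sketch, assuming arrays of 3D joint coordinates:

```python
import numpy as np

def min_mpjpe(hypotheses, gt):
    """Minimum mean-per-joint position error over K pose hypotheses.

    hypotheses: array of shape (K, J, 3), K candidate 3D poses with J joints.
    gt:         array of shape (J, 3), the ground-truth pose.
    Returns the MPJPE of the best hypothesis.
    """
    # Per-hypothesis MPJPE: mean Euclidean distance over joints.
    errors = np.linalg.norm(hypotheses - gt[None], axis=-1).mean(axis=-1)
    return errors.min()
```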
3 code implementations • 17 Jun 2022 • Konstantin F. Willeke, Paul G. Fahey, Mohammad Bashiri, Laura Pede, Max F. Burg, Christoph Blessing, Santiago A. Cadena, Zhiwei Ding, Konstantin-Klemens Lurz, Kayla Ponder, Taliah Muhammad, Saumil S. Patel, Alexander S. Ecker, Andreas S. Tolias, Fabian H. Sinz
The neural underpinning of the biological visual system is challenging to study experimentally, in particular as the neuronal activity becomes increasingly nonlinear with respect to visual input.
1 code implementation • NeurIPS 2021 • Shahd Safarani, Arne Nix, Konstantin Willeke, Santiago A. Cadena, Kelli Restivo, George Denfield, Andreas S. Tolias, Fabian H. Sinz
We found that our co-trained network is more sensitive to content than noise when compared to a baseline network trained for image classification alone.
no code implementations • ICLR 2021 • Konstantin-Klemens Lurz, Mohammad Bashiri, Konstantin Willeke, Akshay Jagadish, Eric Wang, Edgar Y. Walker, Santiago A Cadena, Taliah Muhammad, Erick Cobos, Andreas S. Tolias, Alexander S Ecker, Fabian H. Sinz
With this new readout we train our network on neural responses from mouse primary visual cortex (V1) and obtain a gain in performance of 7% compared to the previous state-of-the-art network.
1 code implementation • 22 Oct 2020 • R. James Cotton, Fabian H. Sinz, Andreas S. Tolias
We overcome this limitation by formulating the problem as $K$-shot prediction to directly infer a neuron's tuning function from a small set of stimulus-response pairs using a Neural Process.
no code implementations • ICLR 2020 • Ivan Ustyuzhaninov, Santiago A. Cadena, Emmanouil Froudarakis, Paul G. Fahey, Edgar Y. Walker, Erick Cobos, Jacob Reimer, Fabian H. Sinz, Andreas S. Tolias, Matthias Bethge, Alexander S. Ecker
Similar to a convolutional neural network (CNN), the mammalian retina encodes visual information into several dozen nonlinear feature maps, each formed by one ganglion cell type that tiles the visual space in an approximately shift-equivariant manner.
1 code implementation • NeurIPS 2019 • Zhe Li, Wieland Brendel, Edgar Y. Walker, Erick Cobos, Taliah Muhammad, Jacob Reimer, Matthias Bethge, Fabian H. Sinz, Xaq Pitkow, Andreas S. Tolias
We propose to regularize CNNs using large-scale neuroscience data to learn more robust neural features in terms of representational similarity.
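A hedged sketch of what such a representational-similarity penalty can look like: match the stimulus-by-stimulus similarity structure of a model layer to that of recorded neural responses. The exact similarity measure and weighting used in the paper may differ.

```python
import torch
import torch.nn.functional as F

def similarity_matrix(x):
    """Cosine-similarity matrix across stimuli; x has shape (n_stimuli, n_features)."""
    x = x - x.mean(dim=1, keepdim=True)
    x = F.normalize(x, dim=1)
    return x @ x.t()

def similarity_loss(model_features, neural_responses):
    """Penalize mismatch between the model's and the brain's
    representational similarity structure (illustrative sketch only)."""
    sm = similarity_matrix(model_features)
    sn = similarity_matrix(neural_responses)
    # Compare only off-diagonal entries (the diagonal is trivially 1).
    mask = ~torch.eye(sm.shape[0], dtype=torch.bool, device=sm.device)
    return ((sm[mask] - sn[mask]) ** 2).mean()
```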
no code implementations • NeurIPS Workshop Neuro_AI 2019 • Santiago A. Cadena, Fabian H. Sinz, Taliah Muhammad, Emmanouil Froudarakis, Erick Cobos, Edgar Y. Walker, Jake Reimer, Matthias Bethge, Andreas Tolias, Alexander S. Ecker
Recent work on modeling neural responses in the primate visual system has benefited from deep neural networks trained on large-scale object recognition, and found a hierarchical correspondence between layers of the artificial neural network and brain areas along the ventral visual stream.
1 code implementation • ICLR 2019 • Alexander S. Ecker, Fabian H. Sinz, Emmanouil Froudarakis, Paul G. Fahey, Santiago A. Cadena, Edgar Y. Walker, Erick Cobos, Jacob Reimer, Andreas S. Tolias, Matthias Bethge
We present a framework to identify common features independent of individual neurons' orientation selectivity by using a rotation-equivariant convolutional neural network, which automatically extracts every feature at multiple orientations.
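A minimal sketch of the underlying weight-sharing idea, restricted to 90-degree rotations where kernels can be rotated exactly; the rotation-equivariant network in the paper handles finer orientation steps, so treat this as illustrative only.

```python
import torch
import torch.nn.functional as F

def rotation_shared_conv(x, weight, bias=None):
    """Apply one filter bank at four 90-degree rotations with shared weights.

    x:      (N, C_in, H, W) input
    weight: (C_out, C_in, k, k) filter bank (odd k assumed for same-size output)
    Returns (N, 4*C_out, H, W): one channel group per orientation.
    """
    outputs = []
    for k in range(4):
        w_rot = torch.rot90(weight, k, dims=(2, 3))  # rotate each filter in-plane
        outputs.append(F.conv2d(x, w_rot, bias=bias,
                                padding=weight.shape[-1] // 2))
    # Stack the orientation copies along the channel dimension.
    return torch.cat(outputs, dim=1)
```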
no code implementations • 14 Nov 2016 • Mengye Ren, Renjie Liao, Raquel Urtasun, Fabian H. Sinz, Richard S. Zemel
On the other hand, layer normalization normalizes the activations across all hidden units within a layer.
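As a reminder of the contrast with batch normalization, layer normalization computes its statistics per sample over the hidden units rather than per unit over the batch; a minimal NumPy sketch:

```python
import numpy as np

def layer_norm(x, gain, bias, eps=1e-5):
    """Layer normalization: normalize each sample over its hidden units.

    x:    (batch, n_units) activations
    gain: (n_units,) learned scale
    bias: (n_units,) learned shift
    """
    mean = x.mean(axis=1, keepdims=True)   # statistics per sample, not per unit
    var = x.var(axis=1, keepdims=True)
    return gain * (x - mean) / np.sqrt(var + eps) + bias
```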
no code implementations • NeurIPS 2009 • Matthias Bethge, Eero P. Simoncelli, Fabian H. Sinz
We introduce a new family of distributions, called $L_p$-nested symmetric distributions, whose densities access the data exclusively through a hierarchical cascade of $L_p$-norms.
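An $L_p$-nested function replaces the single exponent of an $L_p$-norm with a tree of exponents, applying an inner norm to each group of coordinates and an outer norm to the results. A minimal recursive sketch; the tree encoding below is an assumption for illustration, not the paper's notation.

```python
import numpy as np

def lp_nested_norm(x, tree):
    """Recursively evaluate an Lp-nested function on a vector x.

    `tree` is a pair (p, children), where each child is either an index
    into x (a leaf coordinate) or another (p, children) node.
    """
    p, children = tree
    terms = []
    for child in children:
        if isinstance(child, int):           # leaf: single coordinate
            terms.append(abs(x[child]))
        else:                                 # inner node: recurse
            terms.append(lp_nested_norm(x, child))
    return sum(t ** p for t in terms) ** (1.0 / p)

# Example: f(x) = (|x0|^2 + (|x1|^1.5 + |x2|^1.5)^(2/1.5))^(1/2)
x = np.array([0.3, -1.2, 0.7])
tree = (2.0, [0, (1.5, [1, 2])])
print(lp_nested_norm(x, tree))
```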
no code implementations • NeurIPS 2008 • Fabian H. Sinz, Matthias Bethge
Bandpass filtering, orientation selectivity, and contrast gain control are prominent features of sensory coding at the level of V1 simple cells.