no code implementations • 27 Nov 2024 • Vishaal Udandarao, Nikhil Parthasarathy, Muhammad Ferjad Naeem, Talfan Evans, Samuel Albanie, Federico Tombari, Yongqin Xian, Alessio Tonioni, Olivier J. Hénaff
Knowledge distillation (KD) is the de facto standard for compressing large-scale models into smaller ones.
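As a sketch of the technique this entry names (not the paper's own objective): standard knowledge distillation trains the student to match the teacher's temperature-softened output distribution via a KL divergence, scaled by T² per the usual convention. All names here are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    z = [v / temperature for v in logits]
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so its gradient magnitude stays comparable to a hard-label loss."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# A student that matches its teacher incurs (near) zero loss.
assert distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]) < 1e-9
```

A higher temperature spreads probability mass over non-target classes, which is what lets the student learn from the teacher's "dark knowledge" rather than just its top prediction.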
no code implementations • 25 Jun 2024 • Talfan Evans, Nikhil Parthasarathy, Hamza Merzic, Olivier J. Hénaff
Multimodal contrastive objectives expose the dependencies between data and thus naturally yield criteria for measuring the joint learnability of a batch.
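The idea can be illustrated with a toy sketch, assuming a learnability score of the kind described here: a batch's contrastive (InfoNCE) loss under the current learner minus its loss under a reference model, so batches that are still hard for the learner but easy for the reference score highest. The function names and 2x2 similarity matrices are hypothetical, not the paper's implementation.

```python
import math

def contrastive_loss(similarity):
    """Symmetric InfoNCE loss for a batch of paired embeddings.

    similarity[i][j] is the scaled dot product between item i of one
    modality and item j of the other; matched pairs sit on the diagonal.
    """
    n = len(similarity)

    def cross_entropy_rows(sim):
        total = 0.0
        for i in range(n):
            row = sim[i]
            m = max(row)
            log_z = m + math.log(sum(math.exp(v - m) for v in row))
            total += log_z - row[i]  # -log softmax at the matched index
        return total / n

    transposed = [[similarity[j][i] for j in range(n)] for i in range(n)]
    return 0.5 * (cross_entropy_rows(similarity) + cross_entropy_rows(transposed))

def learnability(learner_sim, reference_sim):
    """Score a batch: high when the learner finds it hard but the
    reference model finds it easy, i.e. the batch is still learnable."""
    return contrastive_loss(learner_sim) - contrastive_loss(reference_sim)
```

Selecting batches by such a joint score differs from per-example filtering: the contrastive loss couples all pairs in the batch through the softmax normalizer, so learnability is a property of the batch as a whole.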
no code implementations • 21 Dec 2023 • Michael Kuoch, Chi-Ning Chou, Nikhil Parthasarathy, Joel Dapello, James J. DiCarlo, Haim Sompolinsky, SueYeon Chung
Recently, growth in our understanding of the computations performed in both biological and artificial neural networks has largely been driven by either low-level mechanistic studies or global normative approaches.
no code implementations • 18 Dec 2023 • Nikhil Parthasarathy, Olivier J. Hénaff, Eero P. Simoncelli
Finally, when the two-stage model is used as a fixed front-end for a deep network trained to perform object recognition, the resulting model (LCL-V2Net) significantly outperforms standard end-to-end self-supervised, supervised, and adversarially trained models in both generalization to out-of-distribution tasks and alignment with human behavior.

no code implementations • NeurIPS 2023 • Nikhil Parthasarathy, S. M. Ali Eslami, João Carreira, Olivier J. Hénaff
Humans learn powerful representations of objects and scenes by observing how they evolve over time.
no code implementations • 30 Sep 2022 • Skanda Koppula, Yazhe Li, Evan Shelhamer, Andrew Jaegle, Nikhil Parthasarathy, Relja Arandjelovic, João Carreira, Olivier Hénaff
Self-supervised methods have achieved remarkable success in transfer learning, often matching or surpassing the accuracy of supervised pre-training.
no code implementations • 30 Jun 2020 • Nikhil Parthasarathy, Eero P. Simoncelli
These responses are processed by a second stage (analogous to cortical area V2) consisting of convolutional filters followed by half-wave rectification and pooling to generate V2 'complex cell' responses.
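The second stage described here (convolution, then half-wave rectification, then pooling) can be sketched on a 1-D toy signal; this is an illustration of the computation named in the entry, not the actual model, and `complex_cell_responses` is a hypothetical name.

```python
def conv1d_valid(signal, kernel):
    """'Valid' 1-D convolution: slide the kernel with no padding."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def complex_cell_responses(v1_responses, kernel, pool=2):
    """Toy second stage: convolve V1-like responses with a learned
    filter, half-wave rectify, then average-pool non-overlapping
    windows to produce 'complex cell'-style outputs."""
    filtered = conv1d_valid(v1_responses, kernel)
    rectified = [max(0.0, v) for v in filtered]  # half-wave rectification
    return [sum(rectified[i:i + pool]) / pool
            for i in range(0, len(rectified) - pool + 1, pool)]
```

Half-wave rectification keeps responses nonnegative, and pooling over neighboring positions yields the partial position invariance characteristic of complex cells.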
1 code implementation • 15 Jul 2019 • Reuben Feinman, Nikhil Parthasarathy
Normalizing Flows are a promising new class of algorithms for unsupervised learning based on maximum-likelihood optimization via a change of variables.
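The change-of-variables formula underlying such models can be shown in a minimal one-dimensional sketch: an invertible affine map pushes the data onto a standard normal base density, and the log-likelihood corrects for the volume change with the log-determinant of the Jacobian. The function name is illustrative.

```python
import math

def affine_flow_logprob(x, scale, shift):
    """Log-density of x under an affine flow z = (x - shift) / scale
    with a standard normal base, via the change of variables:
        log p(x) = log N(z; 0, 1) + log|dz/dx|
    Here dz/dx = 1/scale, so the correction is -log|scale|.
    """
    z = (x - shift) / scale
    base_logp = -0.5 * (z * z + math.log(2 * math.pi))
    log_det = -math.log(abs(scale))
    return base_logp + log_det
```

Stacking many such invertible layers, with log-determinants summed across layers, gives a flexible density whose exact likelihood can be maximized directly.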
1 code implementation • NeurIPS 2017 • Nikhil Parthasarathy, Eleanor Batty, William Falcon, Thomas Rutten, Mohit Rajpal, E.J. Chichilnisky, Liam Paninski
Decoding sensory stimuli from neural signals can be used to reveal how we sense our physical environment, and is valuable for the design of brain-machine interfaces.