1 code implementation • NeurIPS 2023 • Abdulkadir Canatar, Jenelle Feather, Albert Wakhloo, SueYeon Chung
The representations of neural networks are often compared to those of biological systems by regressing the neural network responses against responses measured from the biological system.
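A minimal sketch of this comparison procedure, not taken from the paper: fit a ridge regression from a model layer's responses to recorded neural responses, then score predictivity on held-out stimuli. All data, shapes, and the penalty value here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: responses to 100 stimuli from a model layer
# (50 units) and a recorded population (20 neurons) that is a noisy linear
# readout of the model features.
model_resp = rng.normal(size=(100, 50))
neural_resp = model_resp @ rng.normal(size=(50, 20)) + 0.1 * rng.normal(size=(100, 20))

# Split stimuli into train/test and fit a ridge map from model to brain.
train, test = np.arange(80), np.arange(80, 100)
lam = 1.0  # ridge penalty (assumed value)
X, Y = model_resp[train], neural_resp[train]
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

# Score: per-neuron correlation between predicted and held-out responses.
pred = model_resp[test] @ W
r = np.array([np.corrcoef(pred[:, i], neural_resp[test][:, i])[0, 1]
              for i in range(pred.shape[1])])
print(r.mean())  # mean predictivity across neurons
```

With this strongly linear synthetic data the mean predictivity is close to 1; with real recordings the same score quantifies how well the model's representation spans the measured responses.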
no code implementations • 14 Jun 2022 • Abdulkadir Canatar, Evan Peters, Cengiz Pehlevan, Stefan M. Wild, Ruslan Shaydulin
Quantum computers are known to provide speedups over state-of-the-art classical machine learning methods in some specialized settings.
1 code implementation • NeurIPS 2021 • Abdulkadir Canatar, Blake Bordelon, Cengiz Pehlevan
Here, we study generalization in kernel regression when the training and test distributions are different using methods from statistical physics.
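The setting can be illustrated numerically (this is a toy demonstration of train/test distribution mismatch, not the paper's theory): train kernel ridge regression on one input distribution and evaluate it both in-distribution and under a covariate shift. The kernel, target function, and parameter values are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(A, B, ell=0.3):
    # Gaussian (RBF) kernel matrix between row-sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell ** 2))

f = lambda X: np.sin(2 * np.pi * X[:, 0])  # illustrative target

# Train on x ~ U[-1, 0]; test in-distribution and shifted to U[0, 1].
Xtr = rng.uniform(-1, 0, size=(100, 1))
alpha = np.linalg.solve(rbf(Xtr, Xtr) + 1e-3 * np.eye(100), f(Xtr))

def mse(Xte):
    return np.mean((rbf(Xte, Xtr) @ alpha - f(Xte)) ** 2)

err_in = mse(rng.uniform(-1, 0, size=(200, 1)))
err_shift = mse(rng.uniform(0, 1, size=(200, 1)))
print(err_in, err_shift)  # distribution shift inflates the test error
```

Far from the training support the RBF predictor decays toward zero while the target does not, so the shifted test error is much larger than the in-distribution error.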
1 code implementation • NeurIPS 2021 • Jacob A. Zavatone-Veth, Abdulkadir Canatar, Benjamin S. Ruben, Cengiz Pehlevan
However, our theoretical understanding of how the learned hidden layer representations of finite networks differ from the fixed representations of infinite networks remains incomplete.
1 code implementation • 23 Jun 2020 • Abdulkadir Canatar, Blake Bordelon, Cengiz Pehlevan
We present applications of our theory to real and synthetic datasets, and to many kernels, including those that arise from training deep neural networks in the infinite-width limit.
1 code implementation • ICML 2020 • Blake Bordelon, Abdulkadir Canatar, Cengiz Pehlevan
We derive analytical expressions for the generalization performance of kernel regression as a function of the number of training samples using theoretical methods from Gaussian processes and statistical physics.