Search Results for author: Abdulkadir Canatar

Found 6 papers, 5 papers with code

A Spectral Theory of Neural Prediction and Alignment

1 code implementation · NeurIPS 2023 · Abdulkadir Canatar, Jenelle Feather, Albert Wakhloo, SueYeon Chung

The representations of neural networks are often compared to those of biological systems by regressing the network responses against responses measured from the biological system.

Task: regression
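A minimal sketch of the regression-based comparison the abstract describes: fit a ridge regression from model features to "biological" responses and score alignment by per-unit R². All data here is synthetic and the names (`X`, `Y`, `lam`) are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: model activations X and "neural" responses Y
# generated from a hidden linear map plus noise (illustrative only).
n_stimuli, n_features, n_neurons = 200, 50, 10
X = rng.standard_normal((n_stimuli, n_features))
W_true = rng.standard_normal((n_features, n_neurons))
Y = X @ W_true + 0.1 * rng.standard_normal((n_stimuli, n_neurons))

# Ridge regression from model features to neural responses.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)
Y_hat = X @ W

# Per-neuron R^2 as a simple alignment score.
r2 = 1 - ((Y - Y_hat) ** 2).sum(0) / ((Y - Y.mean(0)) ** 2).sum(0)
print(r2.mean())
```

The paper's point is that such regression scores alone can obscure which spectral properties of the representation drive the fit; this sketch only reproduces the baseline procedure being analyzed.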

Bandwidth Enables Generalization in Quantum Kernel Models

no code implementations · 14 Jun 2022 · Abdulkadir Canatar, Evan Peters, Cengiz Pehlevan, Stefan M. Wild, Ruslan Shaydulin

Quantum computers are known to provide speedups over classical state-of-the-art machine learning methods in some specialized settings.

Task: Inductive Bias

Out-of-Distribution Generalization in Kernel Regression

1 code implementation · NeurIPS 2021 · Abdulkadir Canatar, Blake Bordelon, Cengiz Pehlevan

Here, we study generalization in kernel regression when the training and test distributions are different using methods from statistical physics.

Tasks: BIG-bench Machine Learning, Out-of-Distribution Generalization +1
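The setting in this abstract can be illustrated empirically: train kernel ridge regression on one input distribution and evaluate it both in-distribution and on a shifted test distribution. The kernel, target function, and shift below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(A, B, ell=0.5):
    # Gaussian (RBF) kernel matrix between row-wise point sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell ** 2))

f = lambda x: np.sin(2 * np.pi * x[:, 0])  # illustrative target

# Train on x ~ Uniform[-1, 1] with kernel ridge regression.
X = rng.uniform(-1, 1, (100, 1))
alpha = np.linalg.solve(rbf(X, X) + 1e-3 * np.eye(100), f(X))

X_in = rng.uniform(-1, 1, (500, 1))      # same distribution as training
X_out = rng.uniform(1.5, 2.5, (500, 1))  # shifted test distribution

err = lambda Xt: ((rbf(Xt, X) @ alpha - f(Xt)) ** 2).mean()
print(err(X_in), err(X_out))
```

The out-of-distribution error is much larger than the in-distribution error here; the paper's statistical-physics analysis characterizes exactly this gap in terms of the kernel spectrum and the two distributions.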

Asymptotics of representation learning in finite Bayesian neural networks

1 code implementation · NeurIPS 2021 · Jacob A. Zavatone-Veth, Abdulkadir Canatar, Benjamin S. Ruben, Cengiz Pehlevan

However, our theoretical understanding of how the learned hidden layer representations of finite networks differ from the fixed representations of infinite networks remains incomplete.

Task: Representation Learning

Spectral Bias and Task-Model Alignment Explain Generalization in Kernel Regression and Infinitely Wide Neural Networks

1 code implementation · 23 Jun 2020 · Abdulkadir Canatar, Blake Bordelon, Cengiz Pehlevan

We present applications of our theory to real and synthetic datasets, and for many kernels including those that arise from training deep neural networks in the infinite-width limit.

Tasks: BIG-bench Machine Learning, Inductive Bias +1

Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks

1 code implementation · ICML 2020 · Blake Bordelon, Abdulkadir Canatar, Cengiz Pehlevan

We derive analytical expressions for the generalization performance of kernel regression as a function of the number of training samples using theoretical methods from Gaussian processes and statistical physics.

Tasks: Gaussian Processes, regression
