Search Results for author: Neta Shoham

Found 5 papers, 1 paper with code

Scaling Neural Tangent Kernels via Sketching and Random Features

1 code implementation · NeurIPS 2021 · Amir Zandieh, Insu Han, Haim Avron, Neta Shoham, Chaewon Kim, Jinwoo Shin

To accelerate learning with the NTK, we design a near input-sparsity time approximation algorithm for the NTK by sketching the polynomial expansions of arc-cosine kernels; our sketch for the convolutional counterpart of the NTK (CNTK) can transform any image in time linear in the number of pixels. (A toy illustration of the sketching primitive follows this entry.)

Regression
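The core device here is a fast sketch of a kernel's polynomial expansion. As a hedged, self-contained illustration of that primitive only, here is the classical TensorSketch of Pham and Pagh for the degree-p polynomial kernel; this is not the paper's NTK/CNTK algorithm, and all names below are illustrative:

```python
import numpy as np

def countsketch(x, h, s, m):
    # CountSketch of x into m buckets: bucket h[i] accumulates s[i] * x[i].
    out = np.zeros(m)
    np.add.at(out, h, s * x)
    return out

def tensorsketch(x, hs, ss, m):
    # TensorSketch for the degree-p polynomial kernel <x, y>**p:
    # multiply the FFTs of p independent CountSketches elementwise,
    # then invert the FFT (Pham & Pagh, 2013).
    prod = np.ones(m, dtype=complex)
    for h, s in zip(hs, ss):
        prod *= np.fft.fft(countsketch(x, h, s, m))
    return np.real(np.fft.ifft(prod))

rng = np.random.default_rng(0)
d, m, p = 64, 256, 3                                  # input dim, sketch size, degree
hs = [rng.integers(0, m, d) for _ in range(p)]        # hash buckets
ss = [rng.choice([-1.0, 1.0], d) for _ in range(p)]   # random signs
x, y = rng.standard_normal(d), rng.standard_normal(d)
est = tensorsketch(x, hs, ss, m) @ tensorsketch(y, hs, ss, m)
print(est, (x @ y) ** p)                              # sketched vs. exact kernel
```

Since <x^{⊗p}, y^{⊗p}> = <x, y>^p, the inner product of two sketches estimates the polynomial kernel while touching each input only once per CountSketch; sketches of this flavor, applied to the polynomial expansion of arc-cosine kernels, are what give the paper its near input-sparsity runtime.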

An Exploration into why Output Regularization Mitigates Label Noise

no code implementations · 26 Apr 2021 · Neta Shoham, Tomer Avidor, Nadav Israel

In this work we aim to close this gap by showing that losses that incorporate an output regularization term become symmetric as the regularization coefficient goes to infinity.
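A hedged reading of that claim, in notation of my own choosing rather than the paper's: write the regularized loss over k classes as a data term plus an output regularizer, and recall that a loss is called symmetric (a property known to confer robustness to label noise) when its sum over all labels is constant:

$$\ell_\lambda(f(x), y) = \ell(f(x), y) + \lambda\, R(f(x)), \qquad \sum_{y=1}^{k} \ell(f(x), y) = C.$$

Dividing by $\lambda$, the normalized loss $\ell_\lambda / \lambda$ tends to $R(f(x))$ as $\lambda \to \infty$; the limit no longer depends on $y$, so its sum over the $k$ labels is trivially constant. This is one way to see how output regularization pushes a loss toward symmetry, and hence toward noise robustness, in the large-coefficient limit.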

Random Features for the Neural Tangent Kernel

no code implementations · 3 Apr 2021 · Insu Han, Haim Avron, Neta Shoham, Chaewon Kim, Jinwoo Shin

We combine random features of the arc-cosine kernels with a sketching-based algorithm that runs in time linear in both the number of data points and the input dimension.
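The random-feature ingredient is standard enough to sketch. Below is a minimal Monte Carlo estimate of the degree-1 arc-cosine kernel of Cho and Saul, whose compositions underlie the NTK; the sketching half of the paper's algorithm is not shown, and the variable names are mine:

```python
import numpy as np

def arccos1_features(X, W):
    # Degree-1 arc-cosine random features: phi(x) = sqrt(2/m) * relu(W x),
    # so that E[phi(x) . phi(y)] equals the arc-cosine kernel k_1(x, y).
    m = W.shape[0]
    return np.sqrt(2.0 / m) * np.maximum(X @ W.T, 0.0)

def arccos1_exact(x, y):
    # Closed form: k_1(x, y) = (1/pi) |x||y| (sin t + (pi - t) cos t),
    # where t is the angle between x and y (Cho & Saul, 2009).
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    t = np.arccos(np.clip(x @ y / (nx * ny), -1.0, 1.0))
    return nx * ny * (np.sin(t) + (np.pi - t) * np.cos(t)) / np.pi

rng = np.random.default_rng(0)
d, m = 32, 20000
W = rng.standard_normal((m, d))               # Gaussian projection directions
x, y = rng.standard_normal(d), rng.standard_normal(d)
Phi = arccos1_features(np.stack([x, y]), W)
print(Phi[0] @ Phi[1], arccos1_exact(x, y))   # Monte Carlo vs. closed form
```

Each feature map costs one matrix product per example, which is where a runtime linear in both the number of data points and the input dimension can come from once the feature count is kept moderate.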

Experimental Design for Overparameterized Learning with Application to Single Shot Deep Active Learning

no code implementations · 27 Sep 2020 · Neta Shoham, Haim Avron

Unfortunately, classical theory on optimal experimental design focuses on selecting examples in order to learn underparameterized (and thus non-interpolative) models, while modern machine learning models such as deep neural networks are overparameterized and are often trained to interpolate the training data. (The classical criterion is recalled after this entry.)

Active Learning · BIG-bench Machine Learning · +1
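For contrast, the classical setting the abstract alludes to fits in one line (standard experimental-design notation, not the paper's). With a design matrix $X \in \mathbb{R}^{n \times d}$ assembled from the $n$ selected examples, A-optimal design solves

$$\min_{X} \; \operatorname{tr}\big((X^\top X)^{-1}\big),$$

which presumes $X^\top X$ is invertible and hence $n \ge d$; in the overparameterized regime $d > n$ the criterion is undefined as written, which is the gap between classical theory and interpolating models that the paper targets.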

Overcoming Forgetting in Federated Learning on Non-IID Data

no code implementations · 17 Oct 2019 · Neta Shoham, Tomer Avidor, Aviv Keren, Nadav Israel, Daniel Benditkis, Liron Mor-Yosef, Itai Zeitak

Building on an analogy with Lifelong Learning, we adapt a solution for catastrophic forgetting to Federated Learning. (A minimal EWC-style sketch follows this entry.)

Federated Learning
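The best-known remedy for catastrophic forgetting is Elastic Weight Consolidation, which adds a quadratic penalty weighted by Fisher information. As a hedged sketch of how such a penalty can be carried into a client's local objective in federated training, assuming a flat parameter vector and precomputed anchors and Fisher diagonals (all names here are illustrative, not the paper's API):

```python
import numpy as np

def penalized_local_loss(w, data_loss, anchors, fishers, lam):
    """EWC-style local objective for federated training (sketch).

    w         : flat parameter vector of this client's model
    data_loss : callable returning the ordinary local loss at w
    anchors   : list of parameter vectors theta_j from other clients
    fishers   : list of diagonal Fisher estimates F_j (same shape as w)
    lam       : penalty strength
    """
    # Penalize drift away from each anchor, elementwise weighted by the
    # Fisher diagonal, so that parameters important to other clients'
    # data are the most expensive to move.
    penalty = sum(np.sum(F * (w - theta) ** 2)
                  for theta, F in zip(anchors, fishers))
    return data_loss(w) + lam * penalty
```

The Fisher weighting makes drift costly precisely along directions that mattered for the other clients' data, which is how a forgetting remedy from Lifelong Learning can stabilize rounds of federated training on non-IID data.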
