Search Results for author: Elad Granot

Found 3 papers, 0 papers with code

On the Sample Complexity of Two-Layer Networks: Lipschitz vs. Element-Wise Lipschitz Activation

no code implementations • 17 Nov 2022 • Amit Daniely, Elad Granot

We investigate the sample complexity of bounded two-layer neural networks using different activation functions.

An Exact Poly-Time Membership-Queries Algorithm for Extracting a Three-Layer ReLU Network

no code implementations • 20 May 2021 • Amit Daniely, Elad Granot

In this work, we present a polynomial-time algorithm that can learn a depth-two ReLU network from queries under mild general position assumptions.

Tasks: BIG-bench Machine Learning • Model Extraction • +1
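The abstract above assumes query access to the target network. As a minimal sketch of that access model (not the paper's algorithm), the code below implements a one-hidden-layer ReLU network exposing only a `query` oracle: the learner observes $f(x)$ for chosen inputs $x$, never the hidden parameters. All names and parameter values here are illustrative.

```python
# Hedged sketch: a membership-query oracle for a small two-layer
# (one-hidden-layer) ReLU network. Illustrates the access model the
# paper assumes; the network and its weights are made up for this example.
from typing import List


def relu(z: float) -> float:
    return max(0.0, z)


class TwoLayerReLU:
    """f(x) = sum_i a_i * relu(<w_i, x> + b_i)."""

    def __init__(self, weights: List[List[float]], biases: List[float],
                 outputs: List[float]):
        self.weights, self.biases, self.outputs = weights, biases, outputs

    def query(self, x: List[float]) -> float:
        # A membership query: returns f(x) only, hiding all parameters.
        return sum(a * relu(sum(wj * xj for wj, xj in zip(w, x)) + b)
                   for w, b, a in zip(self.weights, self.biases, self.outputs))


# Example: a single query to a tiny network.
net = TwoLayerReLU(weights=[[1.0, -1.0], [0.5, 0.5]],
                   biases=[0.0, -0.25],
                   outputs=[2.0, -1.0])
print(net.query([1.0, 0.0]))  # 2*relu(1.0) - 1*relu(0.25) = 1.75
```

An extraction algorithm in this model issues many such queries at carefully chosen points (e.g. probing where each hidden unit's ReLU switches on) to recover the parameters up to symmetry.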

Generalization Bounds for Neural Networks via Approximate Description Length

no code implementations • NeurIPS 2019 • Amit Daniely, Elad Granot

We show that for any depth $t$, if the inputs are in $[-1, 1]^d$, the sample complexity of $H$ is $\tilde O\left(\frac{dR^2}{\epsilon^2}\right)$.

Tasks: Generalization Bounds
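The stated sample-complexity bound can be read in error form as follows (here $R$ is the paper's norm bound on the network and $\tilde O$ hides logarithmic factors; this rearrangement is a reading aid, not a result from the abstract):

```latex
m(\epsilon) \;=\; \tilde{O}\!\left(\frac{d R^{2}}{\epsilon^{2}}\right)
\quad\Longleftrightarrow\quad
\epsilon(m) \;=\; \tilde{O}\!\left(R\,\sqrt{\frac{d}{m}}\right),
```

i.e. the achievable error shrinks at the usual $1/\sqrt{m}$ rate, scaled by the input dimension $d$ and the norm bound $R$.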
