no code implementations • 17 Nov 2022 • Amit Daniely, Elad Granot
We investigate the sample complexity of bounded two-layer neural networks using different activation functions.
no code implementations • 20 May 2021 • Amit Daniely, Elad Granot
In this work, we present a polynomial-time algorithm that can learn a depth-two ReLU network from queries under mild general-position assumptions.
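The query-learning setting above can be illustrated with a small sketch: the learner has black-box access to a depth-two ReLU network and may evaluate it at arbitrary points. This is an assumed illustration of the access model only, not the paper's algorithm; all names and shapes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 5, 3  # assumed input dimension and hidden width

# A depth-two ReLU network f(x) = a . relu(W x + b), unknown to the learner.
W = rng.standard_normal((k, d))  # hidden-layer weights
b = rng.standard_normal(k)       # hidden-layer biases
a = rng.standard_normal(k)       # output-layer weights

def query(x):
    """Black-box oracle: returns the network's value at the queried point x."""
    return float(a @ np.maximum(W @ x + b, 0.0))

# Along a line x(t) = x0 + t*v the oracle's response is piecewise linear in t,
# with breakpoints where a hidden unit's pre-activation crosses zero -- the
# kind of structure a query learner can exploit.
x0, v = rng.standard_normal(d), rng.standard_normal(d)
outputs = [query(x0 + t * v) for t in np.linspace(-3.0, 3.0, 7)]
```

The oracle is the only interface a query learner sees; recovering `W`, `b`, and `a` (up to the usual symmetries) from such evaluations is the learning task.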
no code implementations • NeurIPS 2019 • Amit Daniely, Elad Granot
We show that for any depth $t$, if the inputs are in $[-1, 1]^d$, the sample complexity of $H$ is $\tilde O\left(\frac{dR^2}{\epsilon^2}\right)$.