no code implementations • 28 Nov 2023 • Xiong Wang, Inbar Seroussi, Fei Lu
Our tLSE method offers a straightforward approach for establishing the optimal minimax rate for models with either local or nonlocal dependency.
no code implementations • 5 Oct 2023 • Noa Rubin, Inbar Seroussi, Zohar Ringel
A key property of deep neural networks (DNNs) is their ability to learn new features during training.
no code implementations • 17 Aug 2023 • Elizabeth Collins-Woodfin, Courtney Paquette, Elliot Paquette, Inbar Seroussi
In addition to the deterministic equivalent, we introduce an SDE with a simplified diffusion coefficient (homogenized SGD) which allows us to analyze the dynamics of general statistics of SGD iterates.
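The diffusion-approximation idea can be illustrated with a toy comparison (this is a sketch under assumed parameters, not the authors' exact construction: the problem, step size, and constant noise scale `sigma` below are all illustrative stand-ins for the paper's state-dependent diffusion coefficient):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, lr, steps = 20, 500, 0.01, 2000

# Synthetic noiseless least-squares problem: R(theta) = ||A theta - b||^2 / (2n)
A = rng.standard_normal((n, d))
theta_star = rng.standard_normal(d)
b = A @ theta_star

def risk(theta):
    r = A @ theta - b
    return r @ r / (2 * n)

# Plain single-sample SGD
theta_sgd = np.zeros(d)
for _ in range(steps):
    i = rng.integers(n)
    theta_sgd -= lr * A[i] * (A[i] @ theta_sgd - b[i])

# SDE-style surrogate: full-gradient drift plus a simplified (here isotropic)
# diffusion term, integrated with Euler-Maruyama
sigma = 0.05  # assumed constant noise scale, for illustration only
theta_sde = np.zeros(d)
for _ in range(steps):
    drift = -A.T @ (A @ theta_sde - b) / n
    theta_sde += lr * drift + np.sqrt(lr) * sigma * rng.standard_normal(d)
```

Both trajectories drive the risk far below its initial value, and the surrogate's simpler noise structure is what makes general statistics of the iterates tractable to analyze.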
no code implementations • 27 Jul 2023 • Inbar Seroussi, Alexander A. Alemi, Moritz Helias, Zohar Ringel
State-of-the-art neural networks require extreme computational power to train.
no code implementations • 12 Jul 2023 • Inbar Seroussi, Asaf Miron, Zohar Ringel
Physically informed neural networks (PINNs) are a promising emerging method for solving differential equations.
no code implementations • 31 Dec 2021 • Inbar Seroussi, Gadi Naveh, Zohar Ringel
Deep neural networks (DNNs) are powerful tools for compressing and distilling information.
no code implementations • 26 Mar 2021 • Inbar Seroussi, Ofer Zeitouni
In this paper, we study lower bounds for the generalization error of models derived from multi-layer neural networks, in the regime where the size of the layers is commensurate with the number of samples in the training data.
no code implementations • 19 Oct 2020 • Clement Cosco, Inbar Seroussi, Ofer Zeitouni
We study the directed polymer model for general graphs (beyond $\mathbb Z^d$) and random walks.
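In the classical case of simple random walk on $\mathbb Z$, the point-to-line partition function of the directed polymer can be computed exactly by a transfer-matrix recursion; a minimal sketch (the inverse temperature, polymer length, and Gaussian disorder law are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
beta, T = 0.3, 50            # inverse temperature and polymer length (assumed)

# i.i.d. Gaussian disorder omega(t, x) on the space-time lattice; the walk
# started at 0 stays within [-T, T], so pad the lattice by one site per side
width = 2 * T + 3
start = T + 1
omega = rng.standard_normal((T + 1, width))

# Transfer matrix: Z[x] at time t is the weight of all walks 0 -> (t, x),
# each path weighted by (1/2)^t * exp(beta * sum of omega along the path)
Z = np.zeros(width)
Z[start] = np.exp(beta * omega[0, start])
for t in range(1, T + 1):
    Znew = np.zeros(width)
    Znew[1:-1] = 0.5 * (Z[:-2] + Z[2:])   # walk steps left or right w.p. 1/2
    Z = Znew * np.exp(beta * omega[t])

partition = Z.sum()                        # point-to-line partition function Z_T
free_energy = np.log(partition) / T        # finite-volume quenched free energy
```

The recursion costs O(T) per time step; on a general graph the same idea replaces the left/right average with an average over the walk's transition kernel.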
Probability
1 code implementation • 11 Feb 2019 • Inbar Seroussi, Nir Levy, Elad Yom-Tov
Understanding the dynamics of infectious disease spread in a heterogeneous population is an important factor in designing control strategies.