1 code implementation • 6 Oct 2024 • Alon Beck, Noam Levi, Yohai Bar-Sinai
Importantly, in the vicinity of the transition, that is, for training sets that are almost separable from the origin, the model may overfit for arbitrarily long times before generalizing.
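A rough sketch of the kind of experiment this describes, under toy assumptions that are not the paper's construction: logistic regression trained through the origin with plain gradient descent on synthetic data, logging train and test error over a long horizon so that a late drop in test error would be visible. The data-generating direction `w_star`, the dimensions, and the learning rate below are all illustrative choices.

```python
# Toy sketch (assumed setup, not the paper's construction): gradient descent on logistic
# regression through the origin, with long-horizon tracking of train vs. test error.
import numpy as np

rng = np.random.default_rng(0)
d, n_train, n_test = 50, 40, 2000

# Hypothetical ground-truth direction; labels are the sign of the projection onto it.
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)

def sample(n):
    x = rng.normal(size=(n, d))
    return x, np.sign(x @ w_star)

x_tr, y_tr = sample(n_train)
x_te, y_te = sample(n_test)

w = np.zeros(d)
lr = 0.1
for step in range(1, 100_001):
    margins = np.clip(y_tr * (x_tr @ w), -30, 30)   # clip to avoid overflow in exp
    sigma = 1.0 / (1.0 + np.exp(margins))           # -d/dm log(1 + exp(-m))
    w -= lr * (-(y_tr * sigma)[:, None] * x_tr).mean(axis=0)
    if step % 10_000 == 0:
        train_err = np.mean(np.sign(x_tr @ w) != y_tr)
        test_err = np.mean(np.sign(x_te @ w) != y_te)
        print(f"step {step:>7d}  train error {train_err:.3f}  test error {test_err:.3f}")
```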
no code implementations • 25 Oct 2023 • Noam Levi, Alon Beck, Yohai Bar-Sinai
Grokking is the intriguing phenomenon where a model learns to generalize long after it has fit the training data.
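One way to make that definition operational, as an illustrative helper rather than anything from the paper, is to measure the gap between the step at which training accuracy saturates and the step at which test accuracy finally catches up. The threshold and the synthetic curves below are assumptions chosen only to exercise the function.

```python
# Illustrative helper (not from the paper): quantify the "grokking delay" from logged
# train/test accuracy curves as the gap between fitting and generalizing.
import numpy as np

def grokking_delay(steps, train_acc, test_acc, threshold=0.99):
    """Return (fit_step, generalize_step, delay), or None if a curve never crosses threshold."""
    steps, train_acc, test_acc = map(np.asarray, (steps, train_acc, test_acc))
    fit = np.flatnonzero(train_acc >= threshold)
    gen = np.flatnonzero(test_acc >= threshold)
    if fit.size == 0 or gen.size == 0:
        return None
    return steps[fit[0]], steps[gen[0]], steps[gen[0]] - steps[fit[0]]

# Synthetic example: training accuracy saturates early, test accuracy much later.
steps = np.arange(0, 10_000, 100)
train = np.clip(steps / 500, 0, 1)            # fits the training set by step ~500
test = np.clip((steps - 6_000) / 500, 0, 1)   # generalizes only after step ~6000
print(grokking_delay(steps, train, test))     # -> (500, 6500, 6000)
```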
no code implementations • 3 Apr 2023 • Theo Jules, Gal Brener, Tal Kachman, Noam Levi, Yohai Bar-Sinai
The training of neural networks is a complex, high-dimensional, non-convex and noisy optimization problem, whose theoretical understanding is of interest both from a practical standpoint and for fundamental reasons.
2 code implementations • 11 Apr 2020 • Jiawei Zhuang, Dmitrii Kochkov, Yohai Bar-Sinai, Michael P. Brenner, Stephan Hoyer
The computational cost of fluid simulations increases rapidly with grid resolution.
Computational Physics • Disordered Systems and Neural Networks • Fluid Dynamics
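The scaling behind the sentence above can be made concrete with a standard back-of-envelope argument (not a figure from the paper): for an explicit solver, refining a d-dimensional grid by a factor r multiplies the number of cells by r^d and, through the CFL stability condition, the number of time steps by roughly r, so the total cost grows like r^(d+1).

```python
# Back-of-envelope cost scaling for an explicit fluid solver (standard argument, not
# numbers from the paper): cells grow as r**d, time steps as r, cost as r**(d + 1).
def relative_cost(refinement, dims=2):
    cells = refinement ** dims   # more grid points per snapshot
    steps = refinement           # smaller time step required for stability (CFL)
    return cells * steps

for r in (2, 4, 8):
    print(f"{r}x finer grid in 2-D -> ~{relative_cost(r, dims=2)}x the compute")
```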
3 code implementations • 15 Aug 2018 • Yohai Bar-Sinai, Stephan Hoyer, Jason Hickey, Michael P. Brenner
Many problems in theoretical physics are centered on representing the behavior of a physical theory at long wavelengths and low frequencies by integrating out degrees of freedom that change rapidly in time and space.
Disordered Systems and Neural Networks • Computational Physics
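A minimal sketch of what "integrating out" fast spatial degrees of freedom can look like in practice: block-averaging a finely resolved field onto a coarse grid. This is only an illustration of coarse-graining; the paper's approach learns the coarse-grid update rules from data rather than prescribing them, and the field and block size below are arbitrary choices.

```python
# Minimal coarse-graining sketch (illustrative only): average non-overlapping blocks of a
# finely resolved 1-D field into coarse cells, discarding the rapidly varying component.
import numpy as np

def coarse_grain(field, factor):
    """Average non-overlapping blocks of `factor` fine-grid points into one coarse cell."""
    assert field.size % factor == 0
    return field.reshape(-1, factor).mean(axis=1)

x_fine = np.linspace(0.0, 2 * np.pi, 512, endpoint=False)
u_fine = np.sin(x_fine) + 0.1 * np.sin(40 * x_fine)   # slow mode plus a fast mode

u_coarse = coarse_grain(u_fine, factor=16)            # 512 fine points -> 32 coarse cells
print(u_fine.shape, "->", u_coarse.shape)
```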
1 code implementation • Science Advances (to appear) 2019 • Jordan Hoffmann, Yohai Bar-Sinai, Lisa Lee, Jovana Andrejevic, Shruti Mishra, Shmuel M. Rubinstein, Chris H. Rycroft
Machine learning has gained widespread attention as a powerful tool to identify structure in complex, high-dimensional data.
Soft Condensed Matter