Search Results for author: Yohai Bar-Sinai

Found 6 papers, 4 papers with code

Grokking at the Edge of Linear Separability

1 code implementation • 6 Oct 2024 • Alon Beck, Noam Levi, Yohai Bar-Sinai

Importantly, in the vicinity of the transition, that is, for training sets that are almost separable from the origin, the model may overfit for arbitrarily long times before generalizing.
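"Separable from the origin" means there is a weight vector w with y_i ⟨w, x_i⟩ > 0 for every training point. As a minimal sketch (not the paper's setup), this condition can be checked as a linear-programming feasibility problem; the function name and example data below are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

def separable_from_origin(X, y):
    """Check whether labeled data (y in {-1, +1}) admits a linear separator
    through the origin, i.e. some w with y_i <w, x_i> > 0 for all i.
    By scale invariance this is equivalent to the LP feasibility problem
    y_i <w, x_i> >= 1."""
    A_ub = -(y[:, None] * X)                 # encodes -y_i <w, x_i> <= -1
    b_ub = -np.ones(len(y))
    res = linprog(c=np.zeros(X.shape[1]),    # objective is irrelevant
                  A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * X.shape[1])  # w is unconstrained
    return res.status == 0                   # status 0 = feasible solution found

# Two same-label points in the same half-plane: separable from the origin.
X1 = np.array([[1.0, 0.5], [0.8, 1.0]])
y1 = np.array([1, 1])

# Opposite labels at the same point: no separator through the origin exists.
X2 = np.array([[1.0, 0.0], [1.0, 0.0]])
y2 = np.array([1, -1])
```

Datasets sitting near the boundary between these two regimes are the "almost separable" training sets the abstract refers to.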

Grokking in Linear Estimators -- A Solvable Model that Groks without Understanding

no code implementations • 25 Oct 2023 • Noam Levi, Alon Beck, Yohai Bar-Sinai

Grokking is the intriguing phenomenon where a model learns to generalize long after it has fit the training data.

Memorization

Charting the Topography of the Neural Network Landscape with Thermal-Like Noise

no code implementations • 3 Apr 2023 • Theo Jules, Gal Brener, Tal Kachman, Noam Levi, Yohai Bar-Sinai

The training of neural networks is a complex, high-dimensional, non-convex, and noisy optimization problem, whose theoretical understanding is interesting both from an applied perspective and for fundamental reasons.

Learned discretizations for passive scalar advection in a 2-D turbulent flow

2 code implementations • 11 Apr 2020 • Jiawei Zhuang, Dmitrii Kochkov, Yohai Bar-Sinai, Michael P. Brenner, Stephan Hoyer

The computational cost of fluid simulations increases rapidly with grid resolution.

Computational Physics Disordered Systems and Neural Networks Fluid Dynamics
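The "increases rapidly" claim can be made concrete with a back-of-the-envelope model: an explicit 2-D solver with N grid points per dimension has N² cells, and the CFL stability condition forces the time step to shrink as 1/N, so reaching a fixed physical time costs O(N³) work. This is an illustrative scaling argument, not the paper's code:

```python
def relative_cost(n, n_ref=64):
    """Cost of running an explicit 2-D simulation at resolution n, relative to
    a reference resolution n_ref: (n/n_ref)^2 cells times (n/n_ref) more
    time steps under the CFL constraint, i.e. cubic scaling overall."""
    return (n / n_ref) ** 3

# Doubling the resolution multiplies the cost by 8:
# relative_cost(128) -> 8.0
```

This cubic blow-up is why learned discretizations that stay accurate on coarser grids can pay off.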

Data-driven discretization: a method for systematic coarse graining of partial differential equations

3 code implementations • 15 Aug 2018 • Yohai Bar-Sinai, Stephan Hoyer, Jason Hickey, Michael P. Brenner

Many problems in theoretical physics are centered on representing the behavior of a physical theory at long wavelengths and low frequencies by integrating out degrees of freedom that change rapidly in time and space.

Disordered Systems and Neural Networks Computational Physics
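The simplest instance of "integrating out" fast spatial degrees of freedom is block-averaging a fine-grid field onto a coarse grid. A minimal sketch of that baseline operation (the paper's contribution is to learn better coarse-grained stencils, which this does not show):

```python
import numpy as np

def coarsen(field, factor):
    """Coarse-grain a square 2-D field by averaging factor x factor blocks,
    discarding sub-block spatial detail."""
    n = field.shape[0]
    assert n % factor == 0, "grid size must be divisible by the block factor"
    blocks = field.reshape(n // factor, factor, n // factor, factor)
    return blocks.mean(axis=(1, 3))   # average over the fine axes of each block

fine = np.arange(16.0).reshape(4, 4)
coarse = coarsen(fine, 2)             # shape (2, 2); e.g. coarse[0, 0] = 2.5
```

Averaging is the natural choice here because it preserves the mean of the field, the quantity a conservative PDE discretization tracks per cell.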
