Search Results for author: Noam Levi

Found 8 papers, 2 papers with code

Decoupled Weight Decay for Any $p$ Norm

1 code implementation • 16 Apr 2024 • Nadav Joseph Outmezguine, Noam Levi

With the success of deep neural networks (NNs) in a variety of domains, the computational and storage requirements for training and deploying large NNs have become a bottleneck for further improvements.
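
The snippet above gives only the motivation; the title names the technique. As a rough, hedged sketch of what decoupled weight decay generalized to an arbitrary $p$ norm could look like (AdamW-style decoupling, with the decay term taken as the subgradient of $\frac{1}{p}\|w\|_p^p$), here is a minimal NumPy illustration. The function name and interface are hypothetical, not the paper's code:

```python
import numpy as np

def pnorm_decay_step(w, base_update, lr, lam, p):
    """One optimizer step with decoupled p-norm weight decay (p >= 1).

    base_update: the step produced by the base optimizer (e.g. Adam),
    applied independently of the decay term, as in AdamW.
    The decay term is the subgradient of (1/p) * ||w||_p^p,
    i.e. sign(w) * |w|**(p-1); p = 2 recovers standard decoupled decay,
    p = 1 gives a constant-magnitude pull toward zero.
    """
    w = w - base_update                                    # base optimizer step
    w = w - lr * lam * np.sign(w) * np.abs(w) ** (p - 1)   # decoupled p-norm decay
    return w
```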

Measuring Sharpness in Grokking

1 code implementation • 14 Feb 2024 • Jack Miller, Patrick Gleeson, Charles O'Neill, Thang Bui, Noam Levi

Neural networks sometimes exhibit grokking, a phenomenon where perfect or near-perfect performance is achieved on a validation set well after the same performance has been obtained on the corresponding training set.
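
A minimal way to quantify the delay that defines grokking is an epochs-to-threshold gap between the two accuracy curves; the paper's actual sharpness measures may differ, and the helper below, its name, and the threshold are illustrative:

```python
def grokking_delay(train_acc, val_acc, threshold=0.99):
    """Epochs between the training curve and the validation curve first
    reaching `threshold` accuracy; a large positive gap signals grokking."""
    def first_hit(accs):
        return next((t for t, a in enumerate(accs) if a >= threshold), None)
    t_train, t_val = first_hit(train_acc), first_hit(val_acc)
    if t_train is None or t_val is None:
        return None  # at least one curve never reached the threshold
    return t_val - t_train

# e.g. training hits 99% at epoch 100 but validation only at epoch 900:
# grokking_delay(train_curve, val_curve) -> 800
```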

The Universal Statistical Structure and Scaling Laws of Chaos and Turbulence

no code implementations • 2 Nov 2023 • Noam Levi, Yaron Oz

We show that, from the RMT perspective, the turbulence Gram matrices lie in the same universality class as quantum chaotic rather than integrable systems, and that the data exhibits power-law scalings in the bulk of its eigenvalues which are vastly different from those of uncorrelated classical chaos, random data, and natural images.
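
As a sketch of the kind of measurement the abstract describes, one can compute the eigenvalue spectrum of a dataset's Gram matrix and fit a power law to its bulk. The normalization and rank window below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def gram_spectrum(X):
    """Eigenvalues of the sample-sample Gram matrix of a data matrix X
    (rows are samples), sorted in decreasing order."""
    G = X @ X.T / X.shape[1]
    return np.sort(np.linalg.eigvalsh(G))[::-1]

def bulk_powerlaw_exponent(eigs, lo=10, hi=200):
    """Least-squares fit of lambda_k ~ k**(-alpha) over ranks [lo, hi);
    the window should sit inside the bulk of the spectrum."""
    k = np.arange(lo, hi)
    slope, _ = np.polyfit(np.log(k), np.log(eigs[lo:hi]), 1)
    return -slope  # alpha
```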

Grokking in Linear Estimators -- A Solvable Model that Groks without Understanding

no code implementations • 25 Oct 2023 • Noam Levi, Alon Beck, Yohai Bar-Sinai

Grokking is the intriguing phenomenon where a model learns to generalize long after it has fit the training data.
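
As a toy version of the setting the title names, one could train a linear estimator in a teacher-student setup and watch the train/test gap. Whether the gap closes abruptly (grokking) depends on the dimension-to-sample ratio, the decay strength, and the initialization, which is the regime the paper analyzes; every parameter value below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 200, 100                                  # input dimension > sample count
teacher = rng.normal(size=d) / np.sqrt(d)
X, X_test = rng.normal(size=(n, d)), rng.normal(size=(2000, d))
y, y_test = X @ teacher, X_test @ teacher

w, lr, wd = np.zeros(d), 0.05, 1e-3
for step in range(20001):
    grad = X.T @ (X @ w - y) / n + wd * w        # MSE gradient + L2 decay
    w -= lr * grad
    if step % 5000 == 0:
        train_mse = np.mean((X @ w - y) ** 2)
        test_mse = np.mean((X_test @ w - y_test) ** 2)
        print(f"step {step:6d}  train {train_mse:.4f}  test {test_mse:.4f}")
```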

Task: Memorization

The Underlying Scaling Laws and Universal Statistical Structure of Complex Datasets

no code implementations • 26 Jun 2023 • Noam Levi, Yaron Oz

We study universal traits which emerge both in real-world complex datasets and in artificially generated ones.

Charting the Topography of the Neural Network Landscape with Thermal-Like Noise

no code implementations • 3 Apr 2023 • Theo Jules, Gal Brener, Tal Kachman, Noam Levi, Yohai Bar-Sinai

The training of neural networks is a complex, high-dimensional, non-convex, and noisy optimization problem whose theoretical understanding is of interest both from an applied perspective and for fundamental reasons.
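
One generic way to chart a landscape with thermal-like noise, and a plausible reading of the title, is unadjusted Langevin dynamics: gradient descent plus Gaussian noise whose stationary fluctuations reflect the local curvature. This is a sketch under that assumption, not the paper's protocol, and `langevin_probe` is a hypothetical name:

```python
import numpy as np

def langevin_probe(grad_fn, w0, lr=1e-3, temperature=1e-4, steps=10000, seed=0):
    """Gradient descent with thermal-like Gaussian noise (unadjusted Langevin).
    In a quadratic basin with Hessian H, the iterates' stationary covariance
    approaches temperature * H^{-1}, so the recorded fluctuations probe the
    local topography around the minimum."""
    rng = np.random.default_rng(seed)
    w, trajectory = w0.copy(), []
    for _ in range(steps):
        noise = rng.normal(size=w.shape) * np.sqrt(2.0 * lr * temperature)
        w = w - lr * grad_fn(w) + noise
        trajectory.append(w.copy())
    return np.array(trajectory)
```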

Noise Injection Node Regularization for Robust Learning

no code implementations • 27 Oct 2022 • Noam Levi, Itay M. Bloch, Marat Freytsis, Tomer Volansky

We introduce Noise Injection Node Regularization (NINR), a method of injecting structured noise into Deep Neural Networks (DNNs) during training, resulting in an emergent regularizing effect.
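
The snippet names the mechanism but not its wiring. A plausible minimal reading, sketched below in PyTorch, is an auxiliary node that feeds noise through a learned coupling into a hidden layer during training and is silenced at evaluation; the module name, the Gaussian source, and the injection point are assumptions, not the paper's exact scheme:

```python
import torch
import torch.nn as nn

class NoiseInjectionNode(nn.Module):
    """Adds structured noise to a hidden representation during training only,
    so the network must learn features that are robust to the perturbation."""
    def __init__(self, width, scale=0.1):
        super().__init__()
        self.coupling = nn.Linear(1, width, bias=False)  # learned injection weights
        self.scale = scale

    def forward(self, h):
        if self.training:                                # silent at eval time
            z = torch.randn(h.shape[0], 1, device=h.device) * self.scale
            h = h + self.coupling(z)
        return h
```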

Noise Injection as a Probe of Deep Learning Dynamics

no code implementations • 24 Oct 2022 • Noam Levi, Itay Bloch, Marat Freytsis, Tomer Volansky

We propose a new method to probe the learning mechanism of Deep Neural Networks (DNNs) by perturbing the system using Noise Injection Nodes (NINs).
