no code implementations • 7 Feb 2024 • Itay Lavie, Guy Gur-Ari, Zohar Ringel
We study the inductive bias of Transformers in the infinitely over-parameterized Gaussian process limit and argue that Transformers tend to be biased towards more permutation-symmetric functions in sequence space.
no code implementations • 5 Oct 2023 • Noa Rubin, Inbar Seroussi, Zohar Ringel
A key property of deep neural networks (DNNs) is their ability to learn new features during training.
no code implementations • 27 Jul 2023 • Inbar Seroussi, Alexander A. Alemi, Moritz Helias, Zohar Ringel
State-of-the-art neural networks require extreme computational power to train.
no code implementations • 12 Jul 2023 • Inbar Seroussi, Asaf Miron, Zohar Ringel
Physics-informed neural networks (PINNs) are a promising emerging method for solving differential equations.
no code implementations • 31 Dec 2021 • Inbar Seroussi, Gadi Naveh, Zohar Ringel
Deep neural networks (DNNs) are powerful tools for compressing and distilling information.
no code implementations • NeurIPS 2021 • Gadi Naveh, Zohar Ringel
Deep neural networks (DNNs) in the infinite width/channel limit have received much attention recently, as they provide a clear analytical window to deep learning via mappings to Gaussian Processes (GPs).
no code implementations • 28 Sep 2020 • Gadi Naveh, Oded Ben-David, Haim Sompolinsky, Zohar Ringel
A recent line of works studied wide deep neural networks (DNNs) by approximating them as Gaussian Processes (GPs).
no code implementations • 2 Apr 2020 • Gadi Naveh, Oded Ben-David, Haim Sompolinsky, Zohar Ringel
A recent line of works studied wide deep neural networks (DNNs) by approximating them as Gaussian Processes (GPs).
no code implementations • 25 Sep 2019 • Omry Cohen, Or Malka, Zohar Ringel
A series of recent works established a rigorous correspondence between very wide deep neural networks (DNNs), trained in a particular manner, and noiseless Bayesian Inference with a certain Gaussian Process (GP) known as the Neural Tangent Kernel (NTK).
no code implementations • 12 Jun 2019 • Omry Cohen, Or Malka, Zohar Ringel
In the past decade, deep neural networks (DNNs) came to the fore as the leading machine learning algorithms for a variety of tasks.
no code implementations • 6 Feb 2019 • Oded Ben-David, Zohar Ringel
Leveraging this correspondence, we derive the Deep Gaussian Layer-wise loss functions (DGLs), which, we believe, are the first supervised layer-wise loss functions that are both explicit and competitive in terms of accuracy.
no code implementations • ICLR 2018 • Zohar Ringel, Rodrigo de Bem
In this paper we approach two relevant deep learning topics: i) handling graph-structured input data and ii) a better understanding and analysis of deep networks and related learning algorithms.
no code implementations • 20 Apr 2017 • Maciej Koch-Janusz, Zohar Ringel
Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales.
Disordered Systems and Neural Networks • Statistical Mechanics • Information Theory • Machine Learning