1 code implementation • NeurIPS 2023 • Ido Ben-Shaul, Ravid Shwartz-Ziv, Tomer Galanti, Shai Dekel, Yann LeCun
Self-supervised learning (SSL) is a powerful tool in machine learning, but understanding the learned representations and their underlying mechanisms remains a challenge.
no code implementations • 11 Jan 2023 • Ido Ben-Shaul, Tomer Galanti, Shai Dekel
Multiplication layers are a key component in various influential neural network modules, including self-attention and hypernetwork layers.
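The multiplicative interaction that makes self-attention a "multiplication layer" can be sketched in a few lines (a minimal NumPy illustration of the general mechanism, not this paper's analysis):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head attention built from two multiplication layers:
    the score matrix Q @ K.T and the value mixing weights @ V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # multiplicative interaction between inputs
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # second multiplication layer

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)
```

Unlike a plain linear layer, both factors in each product depend on the input, which is the defining feature of a multiplication layer.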
no code implementations • 18 Feb 2022 • Tomer Galanti, Liane Galanti, Ido Ben-Shaul
Finally, we empirically show that the effective depth of a trained neural network monotonically increases when increasing the number of random labels in data.
no code implementations • 21 Jan 2022 • Ido Ben-Shaul, Shai Dekel
Recent advances in theoretical Deep Learning have identified geometric properties that emerge during training, past the Interpolation Threshold, the point at which the training error reaches zero.
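One widely studied property of this kind is neural collapse, where the within-class variability of last-layer features shrinks relative to the between-class separation once training error hits zero. A hedged NumPy sketch of the standard within/between scatter ratio (the function name is illustrative, not taken from the paper):

```python
import numpy as np

def within_between_ratio(features, labels):
    """Tr(Sigma_W) / Tr(Sigma_B): tends to 0 as features collapse onto class means."""
    d = features.shape[1]
    global_mean = features.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in np.unique(labels):
        X = features[labels == c]
        mu = X.mean(axis=0)
        Sw += (X - mu).T @ (X - mu)
        Sb += len(X) * np.outer(mu - global_mean, mu - global_mean)
    return np.trace(Sw) / np.trace(Sb)
```

Tracking this ratio per layer or per epoch is one simple way to probe such geometric behavior empirically.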
no code implementations • 14 May 2021 • Ido Ben-Shaul, Shai Dekel
We propose a probe for the analysis of deep learning architectures that is based on machine-learning and approximation-theoretic principles.
no code implementations • 1 Jan 2021 • Ido Ben-Shaul, Leah Bar, Nir Sochen
Solving the eigenvalue problem for differential operators is a task that arises in many scientific fields.
no code implementations • 24 Aug 2020 • Jacob Gildenblat, Ido Ben-Shaul, Zvi Lapp, Eldad Klaiman
The bag-level class prediction is derived from the multiple instances by applying a permutation-invariant pooling operator to the instance predictions or embeddings.
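Two common choices of such pooling operators, max and mean, can be sketched as follows (an illustrative NumPy snippet, not the specific operator proposed in the paper):

```python
import numpy as np

def bag_prediction(instance_scores, pooling="max"):
    """Aggregate per-instance scores into one bag-level score.
    Both reductions are permutation invariant: instance order cannot matter."""
    scores = np.asarray(instance_scores, dtype=float)
    if pooling == "max":
        return float(scores.max())   # "a bag is positive if any instance is"
    if pooling == "mean":
        return float(scores.mean())  # evidence averaged over the bag
    raise ValueError(f"unknown pooling: {pooling}")
```

Because bags are unordered sets of instances, any valid pooling operator must return the same value under any permutation of its inputs.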
no code implementations • 20 Jul 2020 • Ido Ben-Shaul, Leah Bar, Nir Sochen
In this work, we explore the ability of neural networks (NNs) to serve as a tool for finding eigenpairs of ordinary differential equations.
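The optimization view underlying such approaches can be illustrated on a discretized operator: gradient descent on the Rayleigh quotient of a finite-difference Laplacian recovers its smallest eigenpair. This is a classical sketch under those assumptions, not the paper's neural parameterization:

```python
import numpy as np

# Finite-difference discretization of -u'' on (0, 1) with Dirichlet boundary
# conditions; its smallest eigenvalue approximates pi**2 (eigenfunction sin(pi x)).
n = 50
h = 1.0 / (n + 1)
L = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

rng = np.random.default_rng(0)
u = rng.normal(size=n)
u /= np.linalg.norm(u)
for _ in range(3000):
    lam = u @ L @ u                 # Rayleigh quotient (valid since ||u|| == 1)
    grad = 2.0 * (L @ u - lam * u)  # gradient of the quotient at unit norm
    u -= 5e-5 * grad                # small step: the stiff operator limits step size
    u /= np.linalg.norm(u)          # project back to the unit sphere
```

An NN-based method replaces the grid vector u with a network u_theta(x) and minimizes an analogous residual or quotient loss over the network's parameters.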