no code implementations • 20 Dec 2023 • Jonathan Brokman, Roy Betser, Rotem Turjeman, Tom Berkov, Ido Cohen, Guy Gilboa
Our modeling can improve training efficiency and lower communication overhead, as shown by our preliminary experiments in the context of federated learning.
no code implementations • 18 Dec 2022 • Rotem Turjeman, Tom Berkov, Ido Cohen, Guy Gilboa
In this work, we propose a model based on the correlation of the parameters' dynamics, which dramatically reduces the dimensionality.
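The idea of exploiting correlated parameter dynamics to reduce dimensionality can be illustrated with a minimal sketch (this is an illustrative analogy, not the authors' actual algorithm): if many parameters evolve along a few shared latent modes during training, a PCA-style projection of their trajectories captures nearly all of the variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate trajectories of 500 parameters over 100 training steps,
# driven by only 3 latent dynamic modes (highly correlated by design).
steps, n_params, n_modes = 100, 500, 3
latent = np.cumsum(rng.standard_normal((steps, n_modes)), axis=0)
mixing = rng.standard_normal((n_modes, n_params))
trajectories = latent @ mixing + 0.01 * rng.standard_normal((steps, n_params))

# PCA via SVD on the centered trajectories: a handful of components
# suffice, so the parameter dynamics are effectively low-dimensional.
centered = trajectories - trajectories.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s**2) / (s**2).sum()
k = int(np.searchsorted(np.cumsum(explained), 0.99)) + 1
print(f"components for 99% of variance: {k} of {n_params}")
```

In a federated setting, transmitting a few mode coefficients instead of full parameter vectors is one way such a low-dimensional model could lower communication overhead.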
no code implementations • CVPR 2023 • Or Streicher, Ido Cohen, Guy Gilboa
We analyze the degrees of freedom of learning this task using batches and propose a stable alignment mechanism that can work both with batch changes and with graph-metric changes.
1 code implementation • 7 Oct 2022 • Adi Haviv, Ido Cohen, Jacob Gidron, Roei Schuster, Yoav Goldberg, Mor Geva
In this work, we offer the first methodological framework for probing and characterizing recall of memorized sequences in transformer LMs.
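One common probing setup for memorized sequences can be sketched as follows (the function names and the toy bigram "model" are hypothetical illustrations, not the paper's framework): feed the model a prefix of a sequence seen at training time and check whether greedy decoding reproduces the memorized continuation token-for-token.

```python
def exact_match_recall(generate, prefix, continuation):
    """Fraction of continuation tokens the model reproduces greedily."""
    predicted = generate(prefix, num_tokens=len(continuation))
    hits = sum(p == t for p, t in zip(predicted, continuation))
    return hits / len(continuation)

# Toy stand-in for an LM: a bigram table that "memorizes" one sequence.
train_seq = ["the", "quick", "brown", "fox", "jumps", "over", "the", "lazy", "dog"]
bigram = {a: b for a, b in zip(train_seq, train_seq[1:])}

def toy_generate(prefix, num_tokens):
    out, cur = [], prefix[-1]
    for _ in range(num_tokens):
        cur = bigram.get(cur, "<unk>")
        out.append(cur)
    return out

# The toy model fully "memorized" the sequence, so recall is 1.0.
score = exact_match_recall(toy_generate, train_seq[:2], train_seq[2:])
print(f"exact-match recall: {score:.2f}")
```

A real probe would replace `toy_generate` with greedy decoding from a transformer LM and aggregate recall over many training sequences.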
no code implementations • 23 Aug 2022 • Ido Cohen, Dan Valsky, Ronen Talmon
Compared to a competing supervised algorithm based on a Hidden Markov Model, our unsupervised method demonstrates similar results in the STN detection task and superior results in the DLOR detection task.
no code implementations • 3 Jul 2020 • Ido Cohen, Omri Azencot, Pavel Lifshitz, Guy Gilboa
Definitions for spectrum and filtering are given, and a Parseval-type identity is shown.
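The classical prototype of a Parseval-type identity, which the paper's spectral framework generalizes, is the energy identity for a decomposition into mutually orthogonal modes (this is the standard linear instance, stated here for intuition; the paper's identity for its nonlinear/dynamical setting is analogous in spirit):

```latex
f = \sum_k \phi_k,
\qquad
\langle \phi_j, \phi_k \rangle = 0 \ \text{for } j \neq k
\;\;\Longrightarrow\;\;
\|f\|^2 = \sum_k \|\phi_k\|^2 .
```

Such an identity lets one interpret the squared norms of the modes as an energy spectrum of the signal, which is what makes spectral filtering well-defined.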
Dynamical Systems • Computational Engineering, Finance, and Science
no code implementations • 30 Nov 2019 • Ido Cohen, Eli David, Nathan S. Netanyahu
In recent years, large datasets of high-resolution mammalian neural images have become available, which has prompted active research on the analysis of gene expression data.
no code implementations • 27 Nov 2017 • Ido Cohen, Eli David, Nathan S. Netanyahu, Noa Liscovitch, Gal Chechik
This paper presents a novel deep learning-based method for learning a functional representation of mammalian neural images.