no code implementations • 6 Jul 2023 • Roni Rabin, Alexandre Djerbetian, Roee Engelberg, Lidan Hackmon, Gal Elidan, Reut Tsarfaty, Amir Globerson
Human communication often involves information gaps between the interlocutors.
no code implementations • 31 May 2023 • Paul Roit, Johan Ferret, Lior Shani, Roee Aharoni, Geoffrey Cideron, Robert Dadashi, Matthieu Geist, Sertan Girgin, Léonard Hussenot, Orgad Keller, Nikola Momchev, Sabela Ramos, Piotr Stanczyk, Nino Vieillard, Olivier Bachem, Gal Elidan, Avinatan Hassidim, Olivier Pietquin, Idan Szpektor
Despite the seeming success of contemporary grounded text generation systems, they often generate text that is factually inconsistent with their input.
no code implementations • 25 Jul 2022 • Deborah Cohen, MoonKyung Ryu, Yinlam Chow, Orgad Keller, Ido Greenberg, Avinatan Hassidim, Michael Fink, Yossi Matias, Idan Szpektor, Craig Boutilier, Gal Elidan
Despite recent advances in natural language understanding and generation, and decades of research on the development of conversational bots, building automated agents that can carry on rich open-ended conversations with humans "in the wild" remains a formidable challenge.
no code implementations • 10 Apr 2022 • Gal Yona, Shay Moran, Gal Elidan, Amir Globerson
We show that there is a natural class where this approach is sub-optimal, and that there is a more comparison-efficient active learning scheme.
no code implementations • 4 Nov 2021 • Sella Nevo, Efrat Morin, Adi Gerzi Rosenthal, Asher Metzger, Chen Barshai, Dana Weitzner, Dafi Voloshin, Frederik Kratzert, Gal Elidan, Gideon Dror, Gregory Begelman, Grey Nearing, Guy Shalev, Hila Noga, Ira Shavitt, Liora Yuklea, Moriah Royz, Niv Giladi, Nofar Peled Levi, Ofir Reich, Oren Gilon, Ronnie Maor, Shahar Timnat, Tal Shechter, Vladimir Anisimov, Yotam Gigi, Yuval Levin, Zach Moshe, Zvika Ben-Haim, Avinatan Hassidim, Yossi Matias
During the 2021 monsoon season, the flood warning system was operational in India and Bangladesh, covering flood-prone regions around rivers with a total area of 287,000 km², home to more than 350M people.
no code implementations • 5 May 2021 • Yaron Shoham, Gal Elidan
Despite seminal advances in reinforcement learning in recent years, many domains where the rewards are sparse, e.g., given only at task completion, remain quite challenging.
2 code implementations • ICCV 2021 • Oran Lang, Yossi Gandelsman, Michal Yarom, Yoav Wald, Gal Elidan, Avinatan Hassidim, William T. Freeman, Phillip Isola, Amir Globerson, Michal Irani, Inbar Mosseri
A natural source for such attributes is the StyleSpace of StyleGAN, which is known to generate semantically meaningful dimensions in the image.
no code implementations • ICLR 2021 • Liran Katzir, Gal Elidan, Ran El-Yaniv
A challenging open question in deep learning is how to handle tabular data.
no code implementations • 29 Nov 2020 • Sella Nevo, Gal Elidan, Avinatan Hassidim, Guy Shalev, Oren Gilon, Grey Nearing, Yossi Matias
Floods are among the most common and deadly natural disasters in the world, and flood warning systems have been shown to be effective in reducing harm.
no code implementations • 1 Jul 2020 • Zach Moshe, Asher Metzger, Gal Elidan, Frederik Kratzert, Sella Nevo, Ran El-Yaniv
In this work we present a novel family of hydrologic models, called HydroNets, which leverages river network structure.
no code implementations • 11 Jun 2020 • Ami Abutbul, Gal Elidan, Liran Katzir, Ran El-Yaniv
A challenging open question in deep learning is how to handle tabular data.
no code implementations • 21 Apr 2020 • Yonatan Woodbridge, Gal Elidan, Ami Wiesel
Quantifying uncertainty in predictions or, more generally, estimating the posterior conditional distribution, is a core challenge in machine learning and statistics.
1 code implementation • NeurIPS 2019 • Yoav Wald, Nofar Noy, Gal Elidan, Ami Wiesel
The core of the difficulty is the non-convexity of the objective function, implying that standard optimization algorithms may converge to sub-optimal critical points.
no code implementations • 3 Nov 2019 • Shai Rozenberg, Gal Elidan, Ran El-Yaniv
Given a deep neural network (DNN) for a classification problem, an application of MAD optimization results in MadNet, a version of the original network, now equipped with an adversarial defense mechanism.
no code implementations • 27 Oct 2019 • Yotam Gigi, Ami Wiesel, Sella Nevo, Gal Elidan, Avinatan Hassidim, Yossi Matias
In this scenario, sharing a low-rank component between the tasks translates to a shared spectral reflection of the water, which is a true underlying physical model.
no code implementations • 25 Sep 2019 • Shai Rozenberg, Gal Elidan, Ran El-Yaniv
This paper is concerned with the defense of deep models against adversarial attacks.
no code implementations • 28 Jan 2019 • Sella Nevo, Vova Anisimov, Gal Elidan, Ran El-Yaniv, Pete Giencke, Yotam Gigi, Avinatan Hassidim, Zach Moshe, Mor Schlesinger, Guy Shalev, Ajai Tirumali, Ami Wiesel, Oleg Zlydenko, Yossi Matias
We propose to build on these strengths and develop ML systems for timely and accurate riverine flood prediction.
no code implementations • 3 Jan 2019 • Yotam Gigi, Gal Elidan, Avinatan Hassidim, Yossi Matias, Zach Moshe, Sella Nevo, Guy Shalev, Ami Wiesel
We demonstrate the efficacy of our approach for the problem of discharge estimation using simulations.
no code implementations • 8 Mar 2018 • Deborah Cohen, Amit Daniely, Amir Globerson, Gal Elidan
Complex classifiers may exhibit "embarrassing" failures in cases where humans can easily provide a justified classification.
2 code implementations • 16 Aug 2016 • Elad ET. Eban, Mariano Schain, Alan Mackey, Ariel Gordon, Rif A. Saurous, Gal Elidan
Modern retrieval systems are often driven by an underlying machine learning model.
no code implementations • 26 Sep 2013 • Yaniv Tenzer, Gal Elidan
We tackle the challenge of efficiently learning the structure of expressive multivariate real-valued densities of copula graphical models.
no code implementations • 26 Sep 2013 • Ofer Meshi, Elad Eban, Gal Elidan, Amir Globerson
We demonstrate the effectiveness of our approach on several domains and show that, despite the relative simplicity of the structure, prediction accuracy is competitive with a fully connected model that is computationally costly at prediction time.
no code implementations • NeurIPS 2012 • Gal Elidan, Cobi Cario
Importantly, the method is as efficient as standard Gaussian BP, and its convergence properties do not depend on the complexity of the univariate marginals, even when a nonparametric representation is used.
no code implementations • NeurIPS 2010 • Gal Elidan
We present the Copula Bayesian Network model for representing multivariate continuous distributions.
no code implementations • NeurIPS 2008 • Gal Elidan, Stephen Gould
In this work we present a novel method for learning Bayesian networks of bounded treewidth that employs global structure modifications and that is polynomial in the size of the graph and the treewidth bound.