no code implementations • 6 Sep 2024 • Coby Penso, Jacob Goldberger
This study addresses the problem of calibrating network confidence while adapting a model that was originally trained on a source domain to a target domain using unlabeled samples from the target domain.
no code implementations • 9 Aug 2024 • Roy Hirsch, Jacob Goldberger
Medical imaging classifiers can achieve high predictive accuracy, but quantifying their uncertainty remains an unresolved challenge, which prevents their deployment in medical clinics.
1 code implementation • 2 Jun 2024 • Ori Ernst, Ori Shapira, Aviv Slobodkin, Sharon Adar, Mohit Bansal, Jacob Goldberger, Ran Levy, Ido Dagan
Multi-document summarization (MDS) is a challenging task, often decomposed into the subtasks of salience and redundancy detection, followed by text generation.
1 code implementation • 4 May 2024 • Coby Penso, Jacob Goldberger
We introduce a conformal score that is robust to label noise.
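As a point of reference for what a conformal score does, here is a minimal sketch of the standard split-conformal procedure that such a score plugs into. The score used below (1 minus the probability of the true label) is the textbook choice, not the paper's noise-robust score; all function names are illustrative.

```python
import numpy as np

def conformal_threshold(cal_probs, cal_labels, alpha=0.1):
    """Split conformal prediction: calibrate a score threshold on a
    held-out set so prediction sets cover the true label with
    probability ~ 1 - alpha. Uses the standard score 1 - p(true label)."""
    n = len(cal_labels)
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample corrected quantile of the calibration scores
    q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)
    return q

def prediction_set(probs, q):
    # Include every class whose conformal score falls below the threshold
    return np.where(1.0 - probs <= q)[0]
```

A noise-robust score, as in the paper, would replace the `1 - p(true label)` line with a quantity less sensitive to mislabeled calibration examples.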
no code implementations • 3 Jan 2024 • Idit Diamant, Amir Rosenfeld, Idan Achituve, Jacob Goldberger, Arnon Netzer
In this paper, we introduce a novel noise-learning approach tailored to address noise distribution in domain adaptation settings and learn to de-confuse the pseudo-labels.
1 code implementation • 24 May 2023 • Avi Caciularu, Matthew E. Peters, Jacob Goldberger, Ido Dagan, Arman Cohan
The integration of multi-document pre-training objectives into language models has resulted in remarkable improvements in multi-document downstream tasks.
no code implementations • 4 May 2023 • Shauli Ravfogel, Yoav Goldberg, Jacob Goldberger
Language models generate text by successively sampling the next word.
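The successive-sampling loop referred to here can be sketched in a few lines; this is the generic ancestral-sampling step of any autoregressive language model, with `step_fn` standing in for a model that maps a token sequence to next-token logits (all names are illustrative).

```python
import numpy as np

def sample_next(logits, temperature=1.0, rng=None):
    """Sample the next token id from a vector of logits."""
    rng = rng or np.random.default_rng()
    z = logits / temperature
    z = z - z.max()                     # numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return int(rng.choice(len(p), p=p))

def generate(step_fn, prompt_ids, n_tokens, rng=None):
    """Autoregressive loop: feed the growing sequence back in and sample."""
    ids = list(prompt_ids)
    for _ in range(n_tokens):
        ids.append(sample_next(step_fn(ids), rng=rng))
    return ids
```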
2 code implementations • NAACL 2022 • Ori Ernst, Avi Caciularu, Ori Shapira, Ramakanth Pasunuru, Mohit Bansal, Jacob Goldberger, Ido Dagan
Text clustering methods were traditionally incorporated into multi-document summarization (MDS) as a means for coping with considerable information repetition.
2 code implementations • NAACL 2022 • Avi Caciularu, Ido Dagan, Jacob Goldberger, Arman Cohan
Long-context question answering (QA) tasks require reasoning over a long document or multiple documents.
no code implementations • 29 Sep 2021 • Lior Frenkel, Jacob Goldberger
We derive an optimization method that is based on a closed form solution for the optimal weight scaling in each bin of a discretized value of the prediction confidence.
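The bin-wise scaling idea can be illustrated with a simplified stand-in: partition predictions by confidence into bins and fit one multiplicative weight per bin so the scaled confidence matches the bin's empirical accuracy. This is not the paper's closed-form solution, only a sketch of the per-bin calibration setup it operates in.

```python
import numpy as np

def binwise_scaling(conf, correct, n_bins=10):
    """Fit one weight per confidence bin so the mean scaled confidence
    in each bin matches that bin's accuracy (illustrative stand-in for
    the paper's closed-form per-bin solution)."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    weights = np.ones(n_bins)
    idx = np.clip(np.digitize(conf, edges) - 1, 0, n_bins - 1)
    for b in range(n_bins):
        mask = idx == b
        if mask.any() and conf[mask].mean() > 0:
            weights[b] = correct[mask].mean() / conf[mask].mean()
    return edges, weights

def calibrate(conf, edges, weights):
    """Apply the per-bin weights to new confidence values."""
    idx = np.clip(np.digitize(conf, edges) - 1, 0, len(weights) - 1)
    return np.clip(conf * weights[idx], 0.0, 1.0)
```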
no code implementations • Joint Conference on Lexical and Computational Semantics 2021 • Avi Caciularu, Ido Dagan, Jacob Goldberger
We introduce a new approach for smoothing and improving the quality of word embeddings.
no code implementations • 11 Feb 2021 • Shlomo E. Chazan, Jacob Goldberger, Sharon Gannot
The experts estimate a mask from the noisy input and the final mask is then obtained as a weighted average of the experts' estimates, with the weights determined by the gating DNN.
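The combination step described above (a gating-weighted average of the experts' mask estimates) reduces to a softmax over gating scores followed by a weighted sum; a minimal sketch, with array shapes assumed for illustration:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def combine_masks(expert_masks, gating_logits):
    """Combine per-expert mask estimates into one final mask.
    expert_masks:  (n_experts, n_bins) mask values in [0, 1]
    gating_logits: (n_experts,) scores from the gating DNN
    Returns the gating-weighted average of the experts' masks."""
    w = softmax(gating_logits)              # weights sum to 1
    return (w[:, None] * expert_masks).sum(axis=0)
```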
no code implementations • 1 Jan 2021 • Avi Caciularu, Jacob Goldberger
In this study we propose a deep clustering algorithm that utilizes the variational autoencoder (VAE) framework with a multi-encoder-decoder neural architecture.
no code implementations • 1 Jan 2021 • Nir Raviv, Avi Caciularu, Tomer Raviv, Jacob Goldberger, Yair Be'ery
Error correction codes are an integral part of communication applications and boost the reliability of transmission.
no code implementations • 1 Jan 2021 • Hodaya Hammer, Shlomo Chazan, Jacob Goldberger, Sharon Gannot
In this study we present a deep neural network-based online multi-speaker localisation algorithm that operates on a multi-microphone array.
no code implementations • 7 Dec 2020 • Yael Ben-Guigui, Jacob Goldberger, Tammy Riklin-Raviv
The pressing need to reduce the capacity of deep neural networks has stimulated the development of network dilution methods and their analysis.
no code implementations • COLING 2020 • Soham Dan, Hagai Taitelbaum, Jacob Goldberger
We propose a natural extension of the PA algorithm that uses multiple orthogonal translation matrices to model the mapping and derive an algorithm to learn these multiple matrices.
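The single-matrix building block the paper generalizes is the closed-form orthogonal Procrustes mapping between embedding spaces; a sketch of that standard step (the multi-matrix extension itself is not reproduced here):

```python
import numpy as np

def orthogonal_procrustes(X, Y):
    """Closed-form orthogonal matrix W minimizing ||X W - Y||_F,
    the standard single-matrix mapping between two embedding spaces.
    The paper learns several such orthogonal matrices instead of one."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt
```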
1 code implementation • EMNLP (BlackboxNLP) 2020 • Shauli Ravfogel, Yanai Elazar, Jacob Goldberger, Yoav Goldberg
Contextualized word representations, such as ELMo and BERT, were shown to perform well on various semantic and syntactic tasks.
1 code implementation • CoNLL (EMNLP) 2021 • Ori Ernst, Ori Shapira, Ramakanth Pasunuru, Michael Lepioshkin, Jacob Goldberger, Mohit Bansal, Ido Dagan
Aligning sentences in a reference summary with their counterparts in source documents was shown as a useful auxiliary summarization task, notably for generating training data for salience detection.
1 code implementation • 26 Aug 2020 • Hodaya Hammer, Shlomo E. Chazan, Jacob Goldberger, Sharon Gannot
In this paper, we present a deep neural network-based online multi-speaker localisation algorithm.
no code implementations • 6 Feb 2020 • Nir Raviv, Avi Caciularu, Tomer Raviv, Jacob Goldberger, Yair Be'ery
Error correction codes are an integral part of communication applications, boosting the reliability of transmission.
no code implementations • IJCNLP 2019 • Hagai Taitelbaum, Gal Chechik, Jacob Goldberger
In this paper we present a novel approach to simultaneously representing multiple languages in a common space.
no code implementations • IJCNLP 2019 • Hagai Taitelbaum, Gal Chechik, Jacob Goldberger
For each source word, we first search for the most relevant auxiliary languages.
no code implementations • 26 Oct 2019 • Eytan Kats, Jacob Goldberger, Hayit Greenspan
Supervised machine learning algorithms, especially in the medical domain, are affected by considerable ambiguity in expert markings.
1 code implementation • NAACL 2019 • Noa Yehezkel Lubin, Jacob Goldberger, Yoav Goldberg
The algorithm jointly learns the noise level in the lexicon, finds the set of noisy pairs, and learns the mapping between the spaces.
no code implementations • 29 Apr 2019 • Ran Bakalo, Jacob Goldberger, Rami Ben-Ari
We show that the time-consuming local annotations involved in supervised learning can be addressed by a weakly supervised method that can leverage a subset of locally annotated data.
no code implementations • 28 Apr 2019 • Ran Bakalo, Rami Ben-Ari, Jacob Goldberger
The high cost of generating expert annotations poses a strong limitation for supervised machine learning methods in medical imaging.
5 code implementations • CVPR 2019 • Eran Goldman, Roei Herzig, Aviv Eisenschtat, Oria Ratzon, Itsik Levi, Jacob Goldberger, Tal Hassner
We propose a novel, deep-learning based method for precise object detection, designed for such challenging settings.
Ranked #4 on Dense Object Detection on SKU-110K
1 code implementation • 25 Mar 2019 • Noa Yehezkel Lubin, Jacob Goldberger, Yoav Goldberg
The algorithm jointly learns the noise level in the lexicon, finds the set of noisy pairs, and learns the mapping between the spaces.
no code implementations • 26 Jan 2019 • Eytan Kats, Jacob Goldberger, Hayit Greenspan
Detection and segmentation of MS lesions is a complex task, largely due to the extremely unbalanced data, with a very small number of lesion pixels available for training.
no code implementations • 16 Dec 2018 • Shlomo E. Chazan, Sharon Gannot, Jacob Goldberger
The optimal clustering is found by minimizing the reconstruction loss of the mixture-of-autoencoders network.
no code implementations • COLING 2018 • Jacob Goldberger, Oren Melamud
Self-normalizing discriminative models approximate the normalized probability of a class without having to compute the partition function.
no code implementations • 19 Mar 2018 • Yaniv Shachor, Hayit Greenspan, Jacob Goldberger
We present a decision concept which explicitly takes into account the input multi-view structure, where for each case there is a different subset of relevant views.
no code implementations • 3 Mar 2018 • Maayan Frid-Adar, Idit Diamant, Eyal Klang, Michal Amitai, Jacob Goldberger, Hayit Greenspan
Then we present a novel scheme for liver lesion classification using CNN.
no code implementations • 8 Jan 2018 • Maayan Frid-Adar, Eyal Klang, Michal Amitai, Jacob Goldberger, Hayit Greenspan
In this paper, we present a data augmentation method that generates synthetic medical images using Generative Adversarial Networks (GANs).
no code implementations • 7 Jan 2018 • Avi Ben-Cohen, Eyal Klang, Michal Marianne Amitai, Jacob Goldberger, Hayit Greenspan
In this work we propose a method for anatomical data augmentation that is based on using slices of computed tomography (CT) examinations that are adjacent to labeled slices as another resource of labeled data for training the network.
no code implementations • 19 Jul 2017 • Maayan Frid-Adar, Idit Diamant, Eyal Klang, Michal Amitai, Jacob Goldberger, Hayit Greenspan
Automatic detection of liver lesions in CT images poses a great challenge for researchers.
no code implementations • EMNLP 2017 • Oren Melamud, Ido Dagan, Jacob Goldberger
Specifically, we show that with minor modifications to word2vec's algorithm, we get principled language models that are closely related to the well-established Noise Contrastive Estimation (NCE) based language models.
no code implementations • ACL 2017 • Oren Melamud, Jacob Goldberger
In this paper we define a measure of dependency between two random variables, based on the Jensen-Shannon (JS) divergence between their joint distribution and the product of their marginal distributions.
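For discrete variables, the dependency measure described above is directly computable from a joint probability table; a minimal sketch:

```python
import numpy as np

def js_dependency(joint):
    """Dependency between two discrete variables: the Jensen-Shannon
    divergence between their joint distribution and the product of
    the marginals. joint: 2-D array of (unnormalized) joint probs."""
    p = joint / joint.sum()
    q = np.outer(p.sum(axis=1), p.sum(axis=0))  # product of marginals
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float((a[mask] * np.log(a[mask] / b[mask])).sum())
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

The measure is zero exactly when the variables are independent (the joint equals the product of marginals) and, unlike mutual information, is bounded.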
no code implementations • ICLR 2019 • Eran Goldman, Jacob Goldberger
This paper presents a novel deep learning architecture to classify structured objects in datasets with a large number of visually similar categories.
no code implementations • 6 Nov 2016 • Yehoshua Dissen, Joseph Keshet, Jacob Goldberger, Cynthia Clopper
We then freeze the parameters of the trained network and use several different datasets to train an adaptation layer that makes the obtained network universal in the sense that it works well for a variety of speakers and speech domains with very different characteristics.
no code implementations • 5 Sep 2016 • Oren Melamud, Ido Dagan, Jacob Goldberger
The obtained language modeling is closely related to NCE language models but is based on a simplified objective function.
1 code implementation • 1 Apr 2016 • Ehud Ben-Reuven, Jacob Goldberger
In this study we address the problem of training a neural network for language identification using both labeled and unlabeled speech samples in the form of i-vectors.
no code implementations • NeurIPS 2009 • Jacob Goldberger, Amir Leshem
The factor graph that corresponds to this problem is very loopy; in fact, it is a complete graph.
no code implementations • NeurIPS 2008 • Lev Faivishevsky, Jacob Goldberger
In this paper we introduce the MeanNN approach for estimation of main information theoretic measures such as differential entropy, mutual information and divergence.
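The core of the MeanNN idea is that averaging the k-nearest-neighbor entropy estimator over all k turns it into a mean over log pairwise distances. The sketch below computes only that scale-sensitive term; the estimator's additive constants are omitted, so treat it as an illustration rather than the full estimator.

```python
import numpy as np

def meannn_entropy_term(X):
    """Scale-sensitive core of a MeanNN-style differential entropy
    estimate: d times the mean log pairwise distance over all sample
    pairs (averaging the kNN estimator over all k reduces to this).
    Additive constants of the full estimator are omitted."""
    n, d = X.shape
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    iu = np.triu_indices(n, k=1)        # skip zero self-distances
    return d * np.log(dists[iu]).mean()
```

Scaling the data by a factor s shifts this term by exactly d·log(s), matching how differential entropy transforms under scaling.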