no code implementations • ICLR 2019 • Guy Hacohen, Daphna Weinshall
Initially, we define the difficulty of a training image via transfer learning from a competitive "teacher" network trained on the ImageNet database, and show improved learning speed and final performance for both small and competitive networks on the CIFAR-10 and CIFAR-100 datasets.
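The idea of scoring difficulty by a teacher network's confidence can be sketched as follows. This is a minimal illustration, not the paper's implementation: `teacher_probs` stands in for the softmax outputs of a pretrained "teacher", and the ordering (easiest first) is one common curriculum schedule.

```python
def difficulty_scores(teacher_probs, labels):
    """Score each example by the teacher's confidence in its true label.

    Higher confidence -> easier example. `teacher_probs` is a hypothetical
    stand-in for softmax outputs of a pretrained teacher network.
    """
    return [probs[y] for probs, y in zip(teacher_probs, labels)]


def curriculum_order(teacher_probs, labels):
    """Return example indices sorted easiest-first."""
    scores = difficulty_scores(teacher_probs, labels)
    return sorted(range(len(scores)), key=lambda i: -scores[i])


# Toy usage: three examples, two classes.
probs = [[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]]
labels = [0, 0, 1]
order = curriculum_order(probs, labels)  # -> [0, 2, 1]: index 0 is easiest
```

In practice the teacher's confidences would come from running the pretrained network over the training set once, and the curriculum would expose batches drawn from progressively harder prefixes of this ordering.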
no code implementations • 27 Aug 2023 • Noam Fluss, Guy Hacohen, Daphna Weinshall
Semi-Supervised Learning (SSL) is a framework that utilizes both labeled and unlabeled data to enhance model performance.
no code implementations • 27 Aug 2023 • Guy Hacohen, Daphna Weinshall
In the domain of semi-supervised learning (SSL), the conventional approach involves training a learner with a limited amount of labeled data alongside a substantial volume of unlabeled data, both drawn from the same underlying distribution.
1 code implementation • 23 May 2022 • Ofer Yehuda, Avihu Dekel, Guy Hacohen, Daphna Weinshall
Deep active learning aims to reduce the annotation cost of training deep models, which are notoriously data-hungry.
1 code implementation • 6 Feb 2022 • Guy Hacohen, Avihu Dekel, Daphna Weinshall
Investigating active learning, we focus on the relation between the number of labeled examples (budget size), and suitable querying strategies.
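The budget-dependent choice of querying strategy can be illustrated with a toy selector. This is a hypothetical sketch, not the paper's method: it assumes a per-example `uncertainty` score is available and encodes the common finding that small budgets favor easy/typical examples while large budgets favor hard/uncertain ones; the `threshold` separating the two regimes is an illustrative parameter.

```python
def select_queries(uncertainty, budget, threshold):
    """Pick which unlabeled examples to send for labeling.

    Hypothetical illustration: with a small budget, prefer typical
    examples (low uncertainty); with a large budget, prefer hard ones
    (high uncertainty).
    """
    # Indices ordered from least to most uncertain.
    idx = sorted(range(len(uncertainty)), key=lambda i: uncertainty[i])
    if budget < threshold:
        return idx[:budget]        # low-budget regime: easiest first
    return idx[::-1][:budget]      # high-budget regime: hardest first


# Toy usage: four unlabeled examples.
u = [0.1, 0.9, 0.5, 0.3]
small = select_queries(u, budget=2, threshold=3)  # -> [0, 3] (most typical)
large = select_queries(u, budget=2, threshold=2)  # -> [1, 2] (most uncertain)
```

The point of the sketch is only the regime switch; real strategies in this line of work estimate typicality from density in feature space rather than from raw uncertainty.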
Ranked #1 on Active Learning on CIFAR-10 (10,000)
1 code implementation • ACL 2022 • Leshem Choshen, Guy Hacohen, Daphna Weinshall, Omri Abend
These findings suggest that there is some mutual inductive bias that underlies these models' learning of linguistic phenomena.
no code implementations • NeurIPS 2021 • Guy Hacohen, Daphna Weinshall
Empirically, we show how the PC-bias streamlines the order of learning of both linear and non-linear networks, more prominently at earlier stages of learning.
no code implementations • ICML 2020 • Guy Hacohen, Leshem Choshen, Daphna Weinshall
We further show that this pattern of results reflects the interplay between the way neural networks learn and the benchmark datasets themselves.
3 code implementations • 7 Apr 2019 • Guy Hacohen, Daphna Weinshall
We address challenge (i) using two methods: transfer learning from a competitive "teacher" network, and bootstrapping.