Search Results for author: Daphna Weinshall

Found 32 papers, 9 papers with code

In Your Pace: Learning the Right Example at the Right Time

no code implementations ICLR 2019 Guy Hacohen, Daphna Weinshall

We first define the difficulty of a training image via transfer learning from a competitive "teacher" network trained on the ImageNet database, and show improved learning speed and final performance for both small and competitive networks on the CIFAR-10 and CIFAR-100 datasets.

Transfer Learning
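
As a rough illustration of this teacher-based scoring, the sketch below ranks training images by a teacher classifier's confidence in the true label. It assumes a `teacher` already adapted to the target label space; names are hypothetical, not the authors' pipeline.

```python
import torch

def difficulty_order(teacher, loader, device="cpu"):
    """Indices of training examples sorted easy-to-hard, where an easy
    example is one the teacher assigns high true-label confidence.
    Iterate `loader` without shuffling so indices line up."""
    teacher.eval().to(device)
    confs = []
    with torch.no_grad():
        for images, labels in loader:
            probs = torch.softmax(teacher(images.to(device)), dim=1)
            confs.append(probs[torch.arange(len(labels)), labels.to(device)].cpu())
    return torch.argsort(torch.cat(confs), descending=True)  # easy first
```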

Relearning Forgotten Knowledge: on Forgetting, Overfit and Training-Free Ensembles of DNNs

no code implementations 17 Oct 2023 Uri Stern, Daphna Weinshall

An extensive empirical evaluation with modern deep models shows our method's utility on multiple datasets, neural network architectures and training schemes, both when training from scratch and when using pre-trained networks in transfer learning.

Image Classification · Transfer Learning

United We Stand: Using Epoch-wise Agreement of Ensembles to Combat Overfit

1 code implementation 17 Oct 2023 Uri Stern, Daniel Shwartz, Daphna Weinshall

Our method incorporates useful knowledge that the models acquire during the overfit phase, knowledge that early stopping usually discards, without degrading the general performance.

Image Classification · text-classification +1
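
A minimal sketch of the underlying idea, training-free ensembling of checkpoints saved across epochs; the paper's agreement-based weighting is omitted here in favor of uniform averaging.

```python
import torch

def checkpoint_ensemble_predict(checkpoints, images):
    """checkpoints: models saved at different training epochs."""
    probs = None
    with torch.no_grad():
        for model in checkpoints:
            model.eval()
            p = torch.softmax(model(images), dim=1)
            probs = p if probs is None else probs + p
    return (probs / len(checkpoints)).argmax(dim=1)
```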

Pruning the Unlabeled Data to Improve Semi-Supervised Learning

no code implementations 27 Aug 2023 Guy Hacohen, Daphna Weinshall

In the domain of semi-supervised learning (SSL), the conventional approach involves training a learner with a limited amount of labeled data alongside a substantial volume of unlabeled data, both drawn from the same underlying distribution.

Image Classification

Semi-Supervised Learning in the Few-Shot Zero-Shot Scenario

no code implementations 27 Aug 2023 Noam Fluss, Guy Hacohen, Daphna Weinshall

Semi-Supervised Learning (SSL) is a framework that utilizes both labeled and unlabeled data to enhance model performance.

Image Classification

The Dynamic of Consensus in Deep Networks and the Identification of Noisy Labels

no code implementations 2 Oct 2022 Daniel Shwartz, Uri Stern, Daphna Weinshall

This introduces a problem when training in the presence of noisy labels, as the noisy examples cannot be distinguished from clean examples by the end of training.

Ensemble Learning
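
A simplified sketch of the consensus idea: flag examples whose given label independently trained networks rarely agree with. The paper tracks the dynamics of consensus over training; this static check is only an approximation.

```python
import numpy as np

def flag_noisy(ensemble_preds, labels, threshold=0.3):
    """ensemble_preds: (n_models, n_examples) predicted classes.
    Returns indices of examples whose given label few models agree with."""
    agreement = (ensemble_preds == labels[None, :]).mean(axis=0)
    return np.where(agreement < threshold)[0]
```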

Active Learning Through a Covering Lens

1 code implementation 23 May 2022 Ofer Yehuda, Avihu Dekel, Guy Hacohen, Daphna Weinshall

Deep active learning aims to reduce the annotation cost for the training of deep models, which is notoriously data-hungry.

Active Learning · Representation Learning +1

Active Learning on a Budget: Opposite Strategies Suit High and Low Budgets

1 code implementation 6 Feb 2022 Guy Hacohen, Avihu Dekel, Daphna Weinshall

Investigating active learning, we focus on the relation between the number of labeled examples (budget size) and suitable querying strategies.

Active Learning

The Grammar-Learning Trajectories of Neural Language Models

1 code implementation ACL 2022 Leshem Choshen, Guy Hacohen, Daphna Weinshall, Omri Abend

These findings suggest that there is some mutual inductive bias that underlies these models' learning of linguistic phenomena.

Inductive Bias

Principal Components Bias in Over-parameterized Linear Models, and its Manifestation in Deep Neural Networks

no code implementations NeurIPS 2021 Guy Hacohen, Daphna Weinshall

Empirically, we show how the PC-bias streamlines the order of learning of both linear and non-linear networks, more prominently at earlier stages of learning.
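
A tiny self-contained demonstration of the PC-bias in a linear model, with illustrative data rather than the paper's setup: under gradient descent on least squares, the weight-error component along the leading principal direction decays far faster than along the trailing one.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2)) * np.array([3.0, 0.3])  # PC1 variance >> PC2
w_true = np.array([1.0, 1.0])
y = X @ w_true

w, eta = np.zeros(2), 0.01
for t in range(201):
    w -= eta * X.T @ (X @ w - y) / len(X)
    if t % 50 == 0:
        print(t, np.abs(w_true - w))  # first coordinate converges far sooner
```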

More Is More -- Narrowing the Generalization Gap by Adding Classification Heads

no code implementations 9 Feb 2021 Roee Cates, Daphna Weinshall

Our model can be employed during training only and then pruned for prediction, resulting in an architecture equivalent to the base model.

General Classification
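
A minimal sketch of training with auxiliary classification heads that are pruned at inference, so the deployed model matches the base architecture; names and the head count are illustrative.

```python
import torch.nn as nn

class MultiHeadClassifier(nn.Module):
    def __init__(self, backbone, feat_dim, n_classes, n_heads=3):
        super().__init__()
        self.backbone = backbone
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim, n_classes) for _ in range(n_heads)
        )

    def forward(self, x):
        z = self.backbone(x)
        return [head(z) for head in self.heads]  # sum a loss over all heads

# At inference, keep a single head and drop the rest; the pruned model
# is then architecturally identical to the single-head base model.
```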

Boosting the Performance of Semi-Supervised Learning with Unsupervised Clustering

1 code implementation 1 Dec 2020 Boaz Lerner, Guy Shiran, Daphna Weinshall

We also notably improve the results in the extreme cases of 1, 2 and 3 labels per class, and show that features learned by our model are more meaningful for separating the data.

Clustering · Semi-Supervised Image Classification

Multiclass non-Adversarial Image Synthesis, with Application to Classification from Very Small Sample

no code implementations 25 Nov 2020 Itamar Winter, Daphna Weinshall

In the small-data regime, where only a small sample of labeled images is available for training with no access to additional unlabeled data, our results surpass state-of-the-art GAN models trained on the same amount of data.

CoLA · General Classification +1

Multi-Modal Deep Clustering: Unsupervised Partitioning of Images

1 code implementation 5 Dec 2019 Guy Shiran, Daphna Weinshall

Simultaneously, the same deep network is trained to solve an additional self-supervised task of predicting image rotations.

Clustering · Deep Clustering +1
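
The auxiliary rotation task admits a short sketch: each image is rotated by 0/90/180/270 degrees and the network must predict which rotation was applied, a 4-way classification problem.

```python
import torch

def make_rotation_batch(images):
    """images: (B, C, H, W) -> 4B rotated images plus rotation labels."""
    rotated, labels = [], []
    for k in range(4):  # rotate by k * 90 degrees
        rotated.append(torch.rot90(images, k, dims=(2, 3)))
        labels.append(torch.full((len(images),), k, dtype=torch.long))
    return torch.cat(rotated), torch.cat(labels)  # 4-way classification
```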

On The Power of Curriculum Learning in Training Deep Networks

3 code implementations 7 Apr 2019 Guy Hacohen, Daphna Weinshall

We address challenge (i) using two methods: transfer learning from a competitive "teacher" network, and bootstrapping.

Transfer Learning
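
Curriculum training also requires a pacing function that decides how much of the easy-to-hard ordering to expose at each step; below is a minimal sketch of a fixed exponential pacing schedule, with illustrative constants rather than the paper's tuned values.

```python
def pacing(step, total_steps, n_examples, start_frac=0.1, growth=1.9):
    """Size of the training prefix (easy examples first) at `step`."""
    frac = min(1.0, start_frac * growth ** (step / (0.2 * total_steps)))
    return max(1, int(frac * n_examples))

# each batch is then sampled from easy_to_hard_order[:pacing(step, ...)]
```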

Theory of Curriculum Learning, with Convex Loss Functions

no code implementations 9 Dec 2018 Daphna Weinshall, Dan Amir

We also prove that when the ideal difficulty score is fixed, the convergence rate is monotonically increasing with respect to the loss of the current hypothesis at each point.

Binary Classification

Coming to Your Senses: on Controls and Evaluation Sets in Polysemy Research

no code implementations EMNLP 2018 Haim Dubossarsky, Eitan Grossman, Daphna Weinshall

This and additional results point to the conclusion that performance gains reported in previous work may be an artifact of random sense assignment, which is equivalent to sub-sampling and multiple estimation of word vector representations.

Word Embeddings · Word Similarity

Gaussian Mixture Generative Adversarial Networks for Diverse Datasets, and the Unsupervised Clustering of Images

no code implementations 30 Aug 2018 Matan Ben-Yosef, Daphna Weinshall

In order to provide a better fit to the target data distribution when the dataset includes many different classes, we propose a variant of the basic GAN model, called Gaussian Mixture GAN (GM-GAN), where the probability distribution over the latent space is a mixture of Gaussians.

Clustering
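
A minimal sketch of the latent sampling that distinguishes GM-GAN from a vanilla GAN: latents are drawn from a mixture of Gaussians instead of a single Gaussian. The component count and scale below are illustrative.

```python
import torch

def sample_gm_latent(n, dim, means, sigma=0.2):
    """means: (K, dim) mixture centers, e.g. one per expected mode."""
    comp = torch.randint(len(means), (n,))            # pick a component
    return means[comp] + sigma * torch.randn(n, dim)  # Gaussian around it

# means = torch.randn(10, 128)                # hypothetical: K=10, 128-d
# fake = generator(sample_gm_latent(64, 128, means))
```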

Curriculum Learning by Transfer Learning: Theory and Experiments with Deep Networks

no code implementations ICML 2018 Daphna Weinshall, Gad Cohen, Dan Amir

We provide a theoretical investigation of curriculum learning in the context of stochastic gradient descent when optimizing the convex linear regression loss.

Learning Theory · Transfer Learning

Distance-based Confidence Score for Neural Network Classifiers

no code implementations 28 Sep 2017 Amit Mandelbaum, Daphna Weinshall

The reliable measurement of confidence in a classifier's predictions is crucial for many applications, and is therefore an important part of classifier design.

Novelty Detection
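
A rough sketch of a distance-based confidence score, simplified relative to the paper's exact formulation: embed the test point and score confidence by agreement among its nearest training embeddings.

```python
import numpy as np

def knn_confidence(test_emb, train_embs, train_labels, k=10):
    """Predict by majority vote among the k nearest embeddings; confidence
    is the fraction of those neighbors that agree with the vote."""
    d = np.linalg.norm(train_embs - test_emb[None, :], axis=1)
    nn_idx = np.argsort(d)[:k]
    pred = np.bincount(train_labels[nn_idx]).argmax()
    return pred, (train_labels[nn_idx] == pred).mean()
```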

Hidden Layers in Perceptual Learning

no code implementations CVPR 2017 Gad Cohen, Daphna Weinshall

Studies in visual perceptual learning investigate the way human performance improves with practice, in the context of relatively simple (and therefore more manageable) visual tasks.

Specificity · Transfer Learning

Every Untrue Label is Untrue in its Own Way: Controlling Error Type with the Log Bilinear Loss

1 code implementation 20 Apr 2017 Yehezkel S. Resheff, Amit Mandelbaum, Daphna Weinshall

Deep learning has become the method of choice in many application domains of machine learning in recent years, especially for multi-class classification tasks.

Multi-class Classification

Novelty Detection in MultiClass Scenarios with Incomplete Set of Class Labels

no code implementations 21 Apr 2016 Nomi Vinokurov, Daphna Weinshall

Our method is based on the initial assignment of confidence values, which measure the affinity between a new test point and each known class.

Novelty Detection

Optimized Linear Imputation

no code implementations 17 Nov 2015 Yehezkel S. Resheff, Daphna Weinshall

Since most data analysis and statistical methods do not handle missing values gracefully, the first step of the analysis often requires imputing them.

Imputation · regression
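
A minimal sketch of linear imputation for a single column, assuming the remaining columns are complete; the paper jointly optimizes the imputation rather than fitting per column.

```python
import numpy as np

def impute_column(X, col):
    """Fill NaNs in X[:, col] by least squares on the observed rows.
    Assumes the other columns contain no missing values."""
    obs = ~np.isnan(X[:, col])
    others = np.delete(X, col, axis=1)
    A = np.c_[others[obs], np.ones(obs.sum())]          # add intercept
    w, *_ = np.linalg.lstsq(A, X[obs, col], rcond=None)
    miss = ~obs
    X[miss, col] = np.c_[others[miss], np.ones(miss.sum())] @ w
    return X
```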

Topic Modeling of Behavioral Modes Using Sensor Data

no code implementations 16 Nov 2015 Yehezkel S. Resheff, Shay Rotics, Ran Nathan, Daphna Weinshall

A common use of accelerometer data is for supervised learning of behavioral modes.

Clustering

A Cheap System for Vehicle Speed Detection

no code implementations 27 Jan 2015 Chaim Ginzburg, Amit Raphael, Daphna Weinshall

The reliable detection of speed of moving vehicles is considered key to traffic law enforcement in most countries, and is seen by many as an important tool to reduce the number of traffic accidents and fatalities.

Online Learning in The Manifold of Low-Rank Matrices

no code implementations NeurIPS 2010 Uri Shalit, Daphna Weinshall, Gal Chechik

When learning models that are represented in matrix forms, enforcing a low-rank constraint can dramatically improve the memory and run time complexity, while providing a natural regularization of the model.

Multi-Label Image Classification
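
The low-rank constraint can be illustrated with a naive sketch: take a gradient step, then retract onto the rank-k manifold via truncated SVD. The paper derives a much cheaper retraction; the full SVD is shown here only for clarity.

```python
import numpy as np

def low_rank_step(W, grad, lr=0.1, k=5):
    """One gradient step followed by retraction to the rank-k manifold."""
    W = W - lr * grad                       # ordinary gradient step
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]      # best rank-k approximation
```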

Beyond Novelty Detection: Incongruent Events, when General and Specific Classifiers Disagree

no code implementations NeurIPS 2008 Daphna Weinshall, Hynek Hermansky, Alon Zweig, Jie Luo, Holly Jimison, Frank Ohl, Misha Pavel

We define a formal framework for the representation and processing of incongruent events: starting from the notion of label hierarchy, we show how a partial order on labels can be deduced from such hierarchies.

Novelty Detection · Object Recognition +2
