no code implementations • 7 Sep 2024 • Tom Bekor, Niv Nayman, Lihi Zelnik-Manor
Data augmentation has become an integral part of deep learning, as it is known to improve the generalization capabilities of neural networks.
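For context, here is a minimal sketch of a conventional image-augmentation pipeline in PyTorch. The specific transforms and parameters are illustrative only; they are not the searched policy the paper studies.

```python
# A minimal sketch of a standard image-augmentation pipeline in torchvision.
# The transforms and their parameters are illustrative assumptions.
import torchvision.transforms as T

train_transforms = T.Compose([
    T.RandomResizedCrop(224),          # random spatial crop, rescaled to 224x224
    T.RandomHorizontalFlip(p=0.5),     # random left-right flip
    T.ColorJitter(0.4, 0.4, 0.4),      # random brightness/contrast/saturation
    T.ToTensor(),
])
# Applying train_transforms yields a different random view of each training
# image per epoch, which regularizes the network and improves generalization.
```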
no code implementations • 17 Jul 2024 • Ofir Abramovich, Niv Nayman, Sharon Fogel, Inbal Lavi, Ron Litman, Shahar Tsiper, Royee Tichauer, Srikar Appalaraju, Shai Mazor, R. Manmatha
In recent years, notable advancements have been made in the domain of visual document understanding, with the prevailing architecture comprising a cascade of vision and language models.
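As a rough illustration of such a cascade, the schematic PyTorch sketch below wires a vision encoder into a language model; the class and module names are hypothetical, and actual systems differ in how visual features are consumed.

```python
# Schematic sketch (hypothetical names) of a vision-language cascade:
# a vision encoder extracts features from the document image, and a
# language model consumes them to produce text.
import torch
import torch.nn as nn

class VisionLanguageCascade(nn.Module):
    def __init__(self, vision_encoder: nn.Module, language_model: nn.Module):
        super().__init__()
        self.vision_encoder = vision_encoder    # e.g. a ViT-style backbone
        self.language_model = language_model    # e.g. an encoder-decoder LM

    def forward(self, image: torch.Tensor, prompt_tokens: torch.Tensor):
        visual_features = self.vision_encoder(image)   # (B, N, D) patch features
        return self.language_model(visual_features, prompt_tokens)
```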
no code implementations • 19 Apr 2022 • Niv Nayman, Avram Golbert, Asaf Noy, Tan Ping, Lihi Zelnik-Manor
Encouraged by the recent transferability results of self-supervised models, we propose a method that combines self-supervised and supervised pretraining to generate models with both high diversity and high accuracy and, as a result, high transferability.
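As a rough illustration of combining the two pretraining signals, here is a hedged sketch of a joint objective; the contrastive term and the weighting are illustrative assumptions, not the paper's exact recipe.

```python
# A sketch of supervised cross-entropy combined with a generic
# self-supervised contrastive (InfoNCE-style) term. The loss choice and
# weighting lam are illustrative assumptions.
import torch
import torch.nn.functional as F

def combined_pretraining_loss(logits, labels, z1, z2, lam=0.5, tau=0.1):
    """logits : class predictions for labeled images, shape (B, C)
    labels : ground-truth class ids, shape (B,)
    z1, z2 : L2-normalized embeddings of two augmented views, shape (B, D)"""
    supervised = F.cross_entropy(logits, labels)
    sim = z1 @ z2.t() / tau                        # (B, B) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    self_supervised = F.cross_entropy(sim, targets)  # match each view to its pair
    return supervised + lam * self_supervised
```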
1 code implementation • 24 Oct 2021 • Niv Nayman, Yonathan Aflalo, Asaf Noy, Rong Jin, Lihi Zelnik-Manor
Practical use of neural networks often involves requirements on latency, energy, and memory, among others.
2 code implementations • 23 Feb 2021 • Niv Nayman, Yonathan Aflalo, Asaf Noy, Lihi Zelnik-Manor
Realistic use of neural networks often requires adhering to multiple constraints on latency, energy, and memory, among others.
Ranked #21 on Neural Architecture Search on ImageNet
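To make the constrained-search setting concrete, the sketch below expresses an expected-latency term over a one-shot search space as a weighted sum of per-operation latencies. The measured latencies, budget, and penalty form are illustrative; the papers' actual formulations and solvers differ.

```python
# A simplified sketch of a differentiable latency constraint over a
# one-shot search space. Latency values and the budget are illustrative.
import torch

def expected_latency(arch_logits: torch.Tensor, op_latency_ms: torch.Tensor):
    """arch_logits:   (num_edges, num_ops) architecture parameters.
    op_latency_ms: (num_edges, num_ops) measured latency of each op."""
    probs = torch.softmax(arch_logits, dim=-1)
    return (probs * op_latency_ms).sum()

arch_logits = torch.zeros(10, 4, requires_grad=True)   # uniform to start
op_latency_ms = torch.rand(10, 4) * 5.0                # stand-in measurements
lat = expected_latency(arch_logits, op_latency_ms)

budget_ms = 20.0
loss = torch.relu(lat - budget_ms)   # zero once the budget is met
loss.backward()                      # differentiable w.r.t. the arch params
```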
1 code implementation • NeurIPS 2021 • Jian Tan, Niv Nayman, Mengchang Wang
In the proposed two-stage approach, virtual points, along with the means and variances of their unknown function values estimated using the simple kernel of the first stage, are fitted to a more sophisticated kernel model in the second stage.
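A hedged sketch of this two-stage idea with scikit-learn: a simple kernel is fit first, its posterior means and variances at virtual points are computed, and a richer kernel is then fit on the real data augmented with those virtual points, using their predictive variance as per-point observation noise. The kernel choices and virtual-point layout here are illustrative.

```python
# Two-stage kernel fitting sketch: RBF first, Matern second, with the
# stage-1 predictive variance entering stage 2 as heteroscedastic noise.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(12, 1))
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(12)

# Stage 1: simple kernel on the observed data.
gp_simple = GaussianProcessRegressor(kernel=RBF(), alpha=1e-3).fit(X, y)

# Virtual points and their estimated means/variances under the simple kernel.
X_virtual = np.linspace(-3, 3, 8).reshape(-1, 1)
mu_v, std_v = gp_simple.predict(X_virtual, return_std=True)

# Stage 2: richer kernel on real + virtual points; the virtual points'
# predictive variance is used as per-point observation noise (alpha).
X_aug = np.vstack([X, X_virtual])
y_aug = np.concatenate([y, mu_v])
alpha_aug = np.concatenate([np.full(len(y), 1e-3), std_v**2 + 1e-6])
gp_rich = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=alpha_aug)
gp_rich.fit(X_aug, y_aug)
```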
2 code implementations • NeurIPS 2019 • Niv Nayman, Asaf Noy, Tal Ridnik, Itamar Friedman, Rong Jin, Lihi Zelnik-Manor
This paper introduces a novel optimization method for differentiable neural architecture search, based on the theory of prediction with expert advice.
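A minimal sketch of the exponentiated-gradient (multiplicative-weights) update underlying prediction-with-expert-advice methods: each candidate operation is treated as an expert and reweighted by the exponent of its observed loss. The learning rate and loss values below are illustrative.

```python
# Multiplicative-weights update over candidate operations ("experts").
import numpy as np

def expert_advice_step(weights, losses, eta=1.0):
    """weights: current distribution over ops; losses: per-op loss signal."""
    weights = weights * np.exp(-eta * losses)   # multiplicative update
    return weights / weights.sum()              # renormalize to the simplex

w = np.full(4, 0.25)                            # four candidate operations
for _ in range(50):
    per_op_losses = np.array([0.9, 0.4, 0.6, 0.7])  # stand-in loss signal
    w = expert_advice_step(w, per_op_losses)
print(w.round(3))   # mass concentrates on the lowest-loss operation
```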
1 code implementation • 8 Apr 2019 • Asaf Noy, Niv Nayman, Tal Ridnik, Nadav Zamir, Sivan Doveh, Itamar Friedman, Raja Giryes, Lihi Zelnik-Manor
In this paper, we propose a differentiable search space that allows the annealing of architecture weights, while gradually pruning inferior operations.
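A hedged sketch of the anneal-and-prune idea: a temperature-controlled softmax over architecture weights is sharpened over time, and operations whose probability falls below a threshold are pruned from the space. The schedule, threshold, and the stand-in update step are illustrative assumptions.

```python
# Annealed softmax over architecture weights with gradual pruning of
# low-probability operations. Schedule and threshold are illustrative.
import numpy as np

alpha = np.random.randn(5)          # architecture weights for 5 candidate ops
active = np.ones(5, dtype=bool)     # which operations remain in the space

for step in range(100):
    T = max(0.05, 0.95 ** step)                   # annealed temperature
    logits = np.where(active, alpha / T, -np.inf) # pruned ops get zero mass
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    active &= probs >= 0.01                       # prune inferior operations
    alpha += 0.1 * np.random.randn(5) * active    # stand-in for a grad step
```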