1 code implementation • 24 Jan 2024 • Lukas Heinrich, Tobias Golling, Michael Kagan, Samuel Klein, Matthew Leigh, Margarita Osadchy, John Andrew Raine
We propose masked particle modeling (MPM) as a self-supervised method for learning generic, transferable, and reusable representations of unordered sets of inputs for use with high energy physics (HEP) scientific data.
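The core idea of masked modeling on particle sets can be sketched in a few lines: hide a random subset of particles and train a model to recover them. The sketch below is illustrative only and uses an identity stand-in where MPM would use a learned permutation-equivariant model over the set; the masking fraction and feature layout are assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_particles(particles, mask_frac=0.4, rng=rng):
    """Randomly mask a fraction of particles in an unordered set.

    particles: (N, F) array of per-particle features (e.g. pT, eta, phi).
    Returns the corrupted set, the boolean mask, and the masked targets.
    """
    n = len(particles)
    n_mask = max(1, int(mask_frac * n))
    idx = rng.choice(n, size=n_mask, replace=False)
    mask = np.zeros(n, dtype=bool)
    mask[idx] = True
    corrupted = particles.copy()
    corrupted[mask] = 0.0  # stand-in for a [MASK] token
    return corrupted, mask, particles[mask]

# Toy "encoder": identity here; in MPM this would be a
# permutation-equivariant transformer over the particle set.
def encoder(x):
    return x

particles = rng.normal(size=(10, 3))   # 10 particles, 3 features each
corrupted, mask, targets = mask_particles(particles)
pred = encoder(corrupted)[mask]        # predictions at masked positions
loss = np.mean((pred - targets) ** 2)  # reconstruction objective
```

In the self-supervised setting, minimizing this loss over many sets is what forces the encoder to learn reusable representations; no labels are involved.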
no code implementations • 16 Jul 2023 • Murad Tukan, Alaa Maalouf, Margarita Osadchy
Deep learning has grown tremendously in recent years, yielding state-of-the-art results in various fields.
no code implementations • 4 Nov 2021 • Alaa Maalouf, Gilad Eini, Ben Mussay, Dan Feldman, Margarita Osadchy
Our approach offers a new definition of coreset, which is a natural relaxation of the standard definition and aims at approximating the \emph{average} loss of the original data over the queries.
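As a rough illustration of why approximating the *average* loss is a natural relaxation (this is not the paper's construction): even plain uniform sampling yields a small subset whose average loss over a query is close to that of the full data. The squared-distance loss and the subset size here are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)

def average_loss(points, weights, query):
    """Weighted average squared-distance loss of a point set w.r.t. a query."""
    losses = np.sum((points - query) ** 2, axis=1)
    return np.average(losses, weights=weights)

def uniform_coreset(points, m, rng=rng):
    """Sample m points uniformly at random; for the *average* loss,
    uniform weights already give an unbiased estimate."""
    idx = rng.choice(len(points), size=m, replace=False)
    return points[idx], np.ones(m)

data = rng.normal(size=(10000, 2))
core, w = uniform_coreset(data, 200)     # 2% of the data
query = np.array([0.5, -0.2])

full = average_loss(data, np.ones(len(data)), query)
approx = average_loss(core, w, query)
rel_err = abs(full - approx) / full      # small despite the tiny subset
```

Coresets for the standard (sum-of-losses) definition typically need careful sensitivity-based weighting; the point of the relaxation is that the average-loss target is much easier to approximate.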
no code implementations • 24 Dec 2020 • Danny Keller, Margarita Osadchy, Orr Dunkelman
Even in the "hardest" setting (in which we take a reconstructed image from one system and use it in a different system, with a different feature extraction process), the reconstructed image achieves success rates 50 to 120 times higher than the system's FAR.
no code implementations • 19 Aug 2020 • Ben Mussay, Daniel Feldman, Samson Zhou, Vladimir Braverman, Margarita Osadchy
Our method is based on the coreset framework: it approximates the output of a layer of neurons/filters by a coreset of neurons/filters in the previous layer and discards the rest.
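A rough sketch of the pruning idea, under simplifying assumptions: for a linear layer `y = W @ a`, importance-sample a subset of input neurons (columns of `W`) and reweight the kept columns so the pruned layer matches the original output in expectation. This is illustrative importance sampling, not the paper's sensitivity-based coreset construction or its provable guarantees.

```python
import numpy as np

rng = np.random.default_rng(2)

def prune_layer(W, keep, rng=rng):
    """Keep a weighted coreset of input neurons for a linear layer y = W @ a.

    Columns are sampled proportionally to their norm and rescaled by
    1 / (keep * p_i), which makes the pruned output unbiased.
    """
    norms = np.linalg.norm(W, axis=0)
    probs = norms / norms.sum()
    idx = rng.choice(W.shape[1], size=keep, replace=True, p=probs)
    W_pruned = W[:, idx] / (keep * probs[idx])
    return W_pruned, idx

hidden = 512
W = rng.normal(size=(10, hidden))
a = rng.uniform(size=hidden)       # activations from the previous layer

W_p, idx = prune_layer(W, keep=128)
y_full = W @ a
y_approx = W_p @ a[idx]            # output using only the kept neurons
err = np.linalg.norm(y_full - y_approx) / np.linalg.norm(y_full)
```

A single draw can be noisy; the estimator is unbiased, so the error shrinks as `keep` grows (and averaging repeated draws converges to the exact output). Note that this toy version is data-dependent only through `a`; the paper's contribution is a data-independent construction with a provable compression/error trade-off.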
1 code implementation • 27 Apr 2020 • Fangliang Bai, Jinchao Liu, Xiaojuan Liu, Margarita Osadchy, Chao Wang, Stuart J. Gibson
However, to date, there have been two major drawbacks: (1) the high-precision, real-valued sensing patterns proposed in the majority of existing works can prove problematic when used with computational imaging hardware such as a digital micromirror sampling device, and (2) the network structures for image reconstruction involve intensive computation, which is likewise unsuitable for hardware deployment.
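To illustrate drawback (1): a micromirror can only be "on" or "off", so 0/1 binary patterns map directly onto mirror states, whereas real-valued patterns must be approximated. A toy single-pixel-camera measurement model with binary patterns (dimensions and sampling ratio are arbitrary choices for the demo, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 32 * 32  # pixels in the (vectorized) scene
m = 256      # number of measurements: 25% sampling ratio

# 0/1 binary patterns can be displayed directly on a digital
# micromirror device; high-precision real-valued patterns cannot.
patterns = rng.integers(0, 2, size=(m, n)).astype(np.float64)

image = rng.uniform(size=n)      # stand-in for a vectorized scene
measurements = patterns @ image  # one single-pixel measurement per pattern
```

Reconstruction (recovering `image` from `measurements` and `patterns`) is the hard part, which is where the learned reconstruction network comes in.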
no code implementations • ICLR 2020 • Ben Mussay, Margarita Osadchy, Vladimir Braverman, Samson Zhou, Dan Feldman
We propose the first efficient, data-independent neural pruning algorithm with a provable trade-off between its compression rate and the approximation error for any future test sample.
no code implementations • 22 Aug 2018 • Jinchao Liu, Stuart J. Gibson, Margarita Osadchy
Our model features three novel components. The first is a feed-forward embedding that takes random class support samples (after a customary CNN embedding) and transforms them into a better class representation for the classification problem.
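A toy stand-in for that first component, in the spirit of prototype-based few-shot classification (this is not the learned embedding itself): collapse each class's CNN-embedded support samples into a single representation, then classify queries by proximity. The mean pooling and the synthetic data below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def class_representation(support_embeddings):
    """Toy stand-in for the learned feed-forward embedding: collapse a
    class's support samples into one representation (here, their mean)."""
    return support_embeddings.mean(axis=0)

def classify(query, reps):
    """Assign the query to the nearest class representation."""
    dists = np.linalg.norm(reps - query, axis=1)
    return int(np.argmin(dists))

# Two classes, 5 support samples each, 8-dim "CNN embeddings" (toy data).
centers = np.array([[0.0] * 8, [3.0] * 8])
support = [c + rng.normal(scale=0.5, size=(5, 8)) for c in centers]
reps = np.stack([class_representation(s) for s in support])

query = centers[1] + rng.normal(scale=0.5, size=8)
pred = classify(query, reps)  # nearest-representation decision
```

The paper's point is that a *learned* feed-forward mapping from support samples to a class representation can beat such fixed pooling, especially when support sets are small.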
no code implementations • 23 Jun 2018 • Jinchao Liu, Stuart J. Gibson, James Mills, Margarita Osadchy
The effectiveness of CNN-based classification drops rapidly when only a small number of spectra per substance is available for training, which is a typical situation in real applications.
1 code implementation • 18 Aug 2017 • Jinchao Liu, Margarita Osadchy, Lorna Ashton, Michael Foster, Christopher J. Solomon, Stuart J. Gibson
Machine learning methods have found many applications in Raman spectroscopy, especially for the identification of chemical species.
no code implementations • 4 Feb 2017 • Dolev Raviv, Margarita Osadchy
Deep Learning (DL) methods show very good performance when trained on large, balanced data sets.
no code implementations • 11 Nov 2016 • Mor Ohana, Orr Dunkelman, Stuart Gibson, Margarita Osadchy
We implemented the HoneyFaces system and tested it with a password file composed of 270 real users.