Search Results for author: Margarita Osadchy

Found 12 papers, 3 papers with code

Masked Particle Modeling on Sets: Towards Self-Supervised High Energy Physics Foundation Models

1 code implementation · 24 Jan 2024 · Lukas Heinrich, Tobias Golling, Michael Kagan, Samuel Klein, Matthew Leigh, Margarita Osadchy, John Andrew Raine

We propose masked particle modeling (MPM) as a self-supervised method for learning generic, transferable, and reusable representations on unordered sets of inputs for use in high energy physics (HEP) scientific data.

Self-Supervised Learning
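The masking step behind MPM can be sketched in a few lines: randomly mask a subset of elements in an unordered particle set and replace them with a mask token, so a model can be trained to recover the original features at the masked positions. This is a minimal illustration, not the paper's implementation; the function name, the mask fraction, and the constant mask token are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_set(particles, mask_frac=0.3, mask_token=0.0):
    """Randomly mask a fraction of elements in an unordered particle set.

    `particles` is an (n, d) array of per-particle features. Masked
    elements are replaced by a constant mask token; a model would then
    be trained to reconstruct the original features at those positions.
    (Illustrative sketch only, not the paper's exact scheme.)
    """
    n = particles.shape[0]
    n_mask = max(1, int(round(mask_frac * n)))
    idx = rng.choice(n, size=n_mask, replace=False)  # which set elements to hide
    masked = particles.copy()
    masked[idx] = mask_token
    return masked, idx

particles = rng.normal(size=(10, 4))   # 10 particles, 4 features each
masked, idx = mask_set(particles)
```

Because the input is a set, the masked indices carry no positional meaning; the learning target is purely the feature content of the hidden elements.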

Dataset Distillation Meets Provable Subset Selection

no code implementations · 16 Jul 2023 · Murad Tukan, Alaa Maalouf, Margarita Osadchy

Deep learning has grown tremendously over recent years, yielding state-of-the-art results in various fields.

A Unified Approach to Coreset Learning

no code implementations · 4 Nov 2021 · Alaa Maalouf, Gilad Eini, Ben Mussay, Dan Feldman, Margarita Osadchy

Our approach offers a new definition of coreset, which is a natural relaxation of the standard definition and aims at approximating the \emph{average} loss of the original data over the queries.

Network Pruning
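The relaxed coreset notion above — approximating the *average* loss over the data rather than every individual query's loss — can be illustrated with a plain sampling sketch: draw a weighted subset whose weighted mean loss estimates the full mean loss. This is only a toy uniform-sampling version under an assumed quadratic loss; the paper's construction is more careful than plain uniform sampling.

```python
import numpy as np

rng = np.random.default_rng(1)

def average_loss_coreset(data, m):
    """Draw a weighted subset of `data` whose weighted average loss
    estimates the average loss over all of `data` (unbiased under
    uniform sampling). Toy sketch of the average-loss relaxation.
    """
    n = len(data)
    idx = rng.choice(n, size=m, replace=True)
    weights = np.full(m, 1.0 / m)  # each sampled point contributes 1/m
    return data[idx], weights

def quadratic_loss(points, query):
    # toy per-point loss: squared distance to a query
    return (points - query) ** 2

data = rng.normal(size=2000)
coreset, w = average_loss_coreset(data, m=500)
query = 0.5
full_avg = quadratic_loss(data, query).mean()        # loss on all data
core_avg = (w * quadratic_loss(coreset, query)).sum()  # loss on coreset
```

The point of the relaxation is that a subset guaranteeing a good *average* loss can be much smaller than one guaranteeing a good loss for every query simultaneously.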

Fuzzy Commitments Offer Insufficient Protection to Biometric Templates Produced by Deep Learning

no code implementations · 24 Dec 2020 · Danny Keller, Margarita Osadchy, Orr Dunkelman

Even in the "hardest" settings (in which we take a reconstructed image from one system and use it in a different system, with a different feature-extraction process), the reconstructed image offers 50 to 120 times higher success rates than the system's FAR.

Reconstruction Attack

Data-Independent Structured Pruning of Neural Networks via Coresets

no code implementations · 19 Aug 2020 · Ben Mussay, Daniel Feldman, Samson Zhou, Vladimir Braverman, Margarita Osadchy

Our method is based on the coreset framework: it approximates the output of a layer of neurons/filters by a coreset of neurons/filters in the previous layer and discards the rest.

Model Compression
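The idea of approximating a layer's output with a coreset of previous-layer neurons can be sketched as importance sampling over the columns of the weight matrix, reweighting the kept columns so the pruned layer matches the original in expectation. This is a sketch in the spirit of the method, not the paper's exact sensitivity-based construction; the sampling distribution (column norms) is an assumption.

```python
import numpy as np

rng = np.random.default_rng(2)

def coreset_prune_layer(W, m):
    """Keep a coreset of input neurons (columns of W), sampled with
    probability proportional to column norm, and reweight the kept
    connections by inverse probability so W_pruned @ a[kept] is an
    unbiased estimate of W @ a. Illustrative sketch only.
    """
    norms = np.linalg.norm(W, axis=0)
    p = norms / norms.sum()
    idx = rng.choice(W.shape[1], size=m, replace=True, p=p)
    scale = 1.0 / (m * p[idx])       # inverse-probability weights
    W_pruned = W[:, idx] * scale     # reweighted kept connections
    return W_pruned, idx

W = rng.normal(size=(16, 256))       # layer with 256 input neurons
W_small, kept = coreset_prune_layer(W, m=64)
a = rng.normal(size=256)             # previous-layer activations
y_full = W @ a
y_approx = W_small @ a[kept]
```

Because the sampling uses only the weights, not any training data, the same "data-independent" flavor as in the title is preserved: the pruned layer is fixed once and works for any future input.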

LSHR-Net: a hardware-friendly solution for high-resolution computational imaging using a mixed-weights neural network

1 code implementation · 27 Apr 2020 · Fangliang Bai, Jinchao Liu, Xiaojuan Liu, Margarita Osadchy, Chao Wang, Stuart J. Gibson

However, to date, there have been two major drawbacks: (1) the high-precision real-valued sensing patterns proposed in the majority of existing works can prove problematic when used with computational imaging hardware such as a digital micromirror sampling device and (2) the network structures for image reconstruction involve intensive computation, which is also not suitable for hardware deployment.

Image Reconstruction
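The hardware constraint in drawback (1) — that a digital micromirror device cannot display high-precision real-valued sensing patterns — is why mixed/binary weights matter. A minimal sketch of compressive measurement with a ±1 sensing matrix (realizable on a DMD by differencing two 0/1 patterns) follows; the function and parameter names are illustrative, not from LSHR-Net.

```python
import numpy as np

rng = np.random.default_rng(3)

def binary_sensing(image, n_measurements):
    """Compressive measurements y = Phi @ x with a ±1 sensing matrix,
    the kind of pattern DMD hardware can realize, unlike the
    high-precision real-valued patterns criticized above. Sketch only.
    """
    x = image.ravel().astype(float)
    phi = rng.choice([-1.0, 1.0], size=(n_measurements, x.size))
    return phi @ x, phi

image = rng.random((8, 8))           # toy 8x8 scene
y, phi = binary_sensing(image, n_measurements=16)  # 4x compression
```

Reconstruction from `y` is where the network in the paper comes in; the sketch covers only the hardware-friendly sensing side.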

Data-Independent Neural Pruning via Coresets

no code implementations · ICLR 2020 · Ben Mussay, Margarita Osadchy, Vladimir Braverman, Samson Zhou, Dan Feldman

We propose the first efficient, data-independent neural pruning algorithm with a provable trade-off between its compression rate and the approximation error for any future test sample.

Model Compression · Network Pruning

Learning to Support: Exploiting Structure Information in Support Sets for One-Shot Learning

no code implementations · 22 Aug 2018 · Jinchao Liu, Stuart J. Gibson, Margarita Osadchy

Our model features three novel components: First is a feed-forward embedding that takes random class support samples (after a customary CNN embedding) and transfers them to a better class representation in terms of a classification problem.

General Classification · Meta-Learning +1

Dynamic Spectrum Matching with One-shot Learning

no code implementations · 23 Jun 2018 · Jinchao Liu, Stuart J. Gibson, James Mills, Margarita Osadchy

The effectiveness of the classification using CNNs drops rapidly when only a small number of spectra per substance are available for training (which is a typical situation in real applications).

Binary Classification · Classification +4

Latent Hinge-Minimax Risk Minimization for Inference from a Small Number of Training Samples

no code implementations · 4 Feb 2017 · Dolev Raviv, Margarita Osadchy

Deep Learning (DL) methods show very good performance when trained on large, balanced data sets.

Transfer Learning

HoneyFaces: Increasing the Security and Privacy of Authentication Using Synthetic Facial Images

no code implementations · 11 Nov 2016 · Mor Ohana, Orr Dunkelman, Stuart Gibson, Margarita Osadchy

We implemented the HoneyFaces system and tested it with a password file composed of 270 real users.
