Search Results for author: Itamar Zimerman

Found 10 papers, 2 papers with code

The Hidden Attention of Mamba Models

1 code implementation • 3 Mar 2024 • Ameen Ali, Itamar Zimerman, Lior Wolf

The Mamba layer offers an efficient selective state space model (SSM) that is highly effective in modeling multiple domains, including NLP, long-range sequence processing, and computer vision.
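A minimal NumPy sketch of a selective (input-dependent) diagonal state-space recurrence of the kind the Mamba layer builds on; the projections, shapes, and discretisation below are illustrative assumptions, not the paper's implementation. Because each output is a data-dependent mixture of past inputs, unrolling the recurrence yields implicit attention-like weights, which is the view this paper studies.

```python
import numpy as np

def selective_ssm(x, A, W_B, W_C, W_dt):
    """Toy selective diagonal SSM scan (illustrative parameterisation).

    x    : (L, D) input sequence
    A    : (D, N) negative-real diagonal state matrix per channel
    W_B, W_C : (D, N) projections making B_t, C_t input-dependent
    W_dt : (D,)   projection for the per-step discretisation step
    """
    L, D = x.shape
    N = A.shape[1]
    h = np.zeros((D, N))
    y = np.zeros((L, D))
    for t in range(L):
        dt = np.log1p(np.exp(x[t] * W_dt))        # softplus step size, (D,)
        B_t = x[t][:, None] * W_B                 # input-dependent B, (D, N)
        C_t = x[t][:, None] * W_C                 # input-dependent C, (D, N)
        A_bar = np.exp(dt[:, None] * A)           # discretised decay in (0, 1)
        h = A_bar * h + dt[:, None] * B_t * x[t][:, None]
        y[t] = (C_t * h).sum(axis=1)
    return y

# toy usage
rng = np.random.default_rng(0)
L, D, N = 16, 4, 8
out = selective_ssm(rng.normal(size=(L, D)),
                    -np.exp(rng.normal(size=(D, N))),   # stable (negative) A
                    0.1 * rng.normal(size=(D, N)),
                    0.1 * rng.normal(size=(D, N)),
                    0.1 * rng.normal(size=(D,)))
print(out.shape)  # (16, 4)
```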

On the Long Range Abilities of Transformers

no code implementations • 28 Nov 2023 • Itamar Zimerman, Lior Wolf

Despite their dominance in modern deep learning, and especially in NLP, transformer architectures exhibit sub-optimal performance on long-range tasks compared to recent layers that are specifically designed for this purpose.

Inductive Bias

Multi-Dimensional Hyena for Spatial Inductive Bias

no code implementations • 24 Sep 2023 • Itamar Zimerman, Lior Wolf

Our empirical findings indicate that the proposed Hyena N-D layer boosts the performance of various Vision Transformer architectures, such as ViT, Swin, and DeiT across multiple datasets.

Inductive Bias
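A rough sketch of the core operation behind Hyena-style spatial mixing: a global per-channel convolution computed with a 2-D FFT, followed by element-wise gating. The filter and gate here are plain arrays for illustration; in the paper they would be produced by learned, implicit parameterisations, and this is not the authors' code.

```python
import numpy as np

def hyena2d_mixing(x, filt, gate):
    """x, filt, gate: (C, H, W). Global circular convolution per channel via
    2-D FFT (O(HW log HW)), followed by element-wise multiplicative gating."""
    Xf = np.fft.fft2(x)                 # per-channel 2-D spectra
    Kf = np.fft.fft2(filt)
    y = np.fft.ifft2(Xf * Kf).real      # global spatial convolution
    return gate * y                     # data-controlled gating

rng = np.random.default_rng(1)
C, H, W = 3, 8, 8
x = rng.normal(size=(C, H, W))
filt = 0.05 * rng.normal(size=(C, H, W))
gate = 1 / (1 + np.exp(-rng.normal(size=(C, H, W))))   # sigmoid gate
print(hyena2d_mixing(x, filt, gate).shape)  # (3, 8, 8)
```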

Efficient Skip Connections Realization for Secure Inference on Encrypted Data

no code implementations • 11 Jun 2023 • Nir Drucker, Itamar Zimerman

Homomorphic Encryption (HE) is a cryptographic tool that allows computation to be performed directly on encrypted data; it is used by many privacy-preserving machine learning solutions, for example to perform secure classification.

Privacy Preserving
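Only to convey the idea of computing on ciphertexts, here is a toy additively homomorphic scheme (a modular one-time pad): the server adds ciphertexts without seeing the plaintexts, and the key holder decrypts the sum. Real HE schemes (e.g. CKKS or BGV) also support multiplication and are far more involved; this is not any scheme the paper uses.

```python
# Toy additively homomorphic encryption over Z_N (illustration only).
N = 2**16

def enc(m, k):  # encrypt message m with secret pad k
    return (m + k) % N

def dec(c, k):  # decrypt ciphertext c with pad k
    return (c - k) % N

m1, m2 = 123, 456
k1, k2 = 9999, 31337                       # secret pads held by the client
c_sum = (enc(m1, k1) + enc(m2, k2)) % N    # server adds ciphertexts only
print(dec(c_sum, (k1 + k2) % N))           # 579 == m1 + m2
```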

2-D SSM: A General Spatial Layer for Visual Transformers

1 code implementation • 11 Jun 2023 • Ethan Baron, Itamar Zimerman, Lior Wolf

For example, vision transformers equipped with our layer perform well even without positional encoding.

Inductive Bias • Position
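A minimal sketch of one common form of two-dimensional linear state-space recurrence over an image, where the state is propagated both horizontally and vertically; the matrices and scan order here are assumptions for illustration, not necessarily the paper's exact layer.

```python
import numpy as np

def ssm_2d(x, A_h, A_v, B, C):
    """Toy 2-D linear state-space recurrence over a single-channel image.

    x : (H, W) input;  A_h, A_v : (N, N) horizontal / vertical transitions
    B, C : (N,) input / output projections
    h[i, j] = A_h @ h[i, j-1] + A_v @ h[i-1, j] + B * x[i, j];  y = C @ h
    """
    H, W = x.shape
    N = B.shape[0]
    h = np.zeros((H, W, N))
    y = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            left = h[i, j - 1] if j > 0 else np.zeros(N)
            up = h[i - 1, j] if i > 0 else np.zeros(N)
            h[i, j] = A_h @ left + A_v @ up + B * x[i, j]
            y[i, j] = C @ h[i, j]
    return y

rng = np.random.default_rng(2)
N = 4
out = ssm_2d(rng.normal(size=(8, 8)),
             0.3 * np.eye(N), 0.3 * np.eye(N),
             rng.normal(size=N), rng.normal(size=N))
print(out.shape)  # (8, 8)
```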

Decision S4: Efficient Sequence-Based RL via State Spaces Layers

no code implementations • 8 Jun 2023 • Shmuel Bar-David, Itamar Zimerman, Eliya Nachmani, Lior Wolf

Recently, sequence learning methods have been applied to the problem of off-policy Reinforcement Learning, including the seminal work on Decision Transformers, which employs transformers for this task.
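A small sketch of the return-conditioned sequence view of offline RL that Decision-Transformer-style methods use: trajectories are turned into interleaved (return-to-go, state, action) tokens for a sequence model. Shapes and the token layout are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def returns_to_go(rewards, gamma=1.0):
    """Return-to-go at every timestep of a trajectory."""
    rtg = np.zeros(len(rewards))
    running = 0.0
    for t in reversed(range(len(rewards))):
        running = rewards[t] + gamma * running
        rtg[t] = running
    return rtg

rewards = np.array([0.0, 0.0, 1.0, 0.0, 1.0])
states = np.arange(5, dtype=float)[:, None]        # (T, state_dim), toy states
actions = np.array([0, 1, 1, 0, 1])[:, None]       # (T, act_dim)
rtg = returns_to_go(rewards)
tokens = [(rtg[t], states[t], actions[t]) for t in range(len(rewards))]
print(rtg)        # [2. 2. 2. 1. 1.]
print(tokens[0])  # first (return-to-go, state, action) triple
```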

Focus Your Attention (with Adaptive IIR Filters)

no code implementations • 24 May 2023 • Shahar Lutati, Itamar Zimerman, Lior Wolf

We present a new layer in which dynamic (i.e., input-dependent) Infinite Impulse Response (IIR) filters of order two are used to process the input sequence prior to applying conventional attention.

Long-range modeling
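A minimal sketch of a dynamic second-order IIR filter in the sense described above: the five filter coefficients are predicted from the input at each step and the standard order-two recurrence is applied. The toy coefficient function below is a hypothetical stand-in for the paper's learned network.

```python
import numpy as np

def adaptive_iir2(x, coeff_net):
    """Apply an input-dependent second-order IIR filter to a 1-D sequence.

    x : (L,) input; coeff_net maps x[t] -> (b0, b1, b2, a1, a2)
    y[t] = b0*x[t] + b1*x[t-1] + b2*x[t-2] - a1*y[t-1] - a2*y[t-2]
    """
    L = len(x)
    y = np.zeros(L)
    for t in range(L):
        b0, b1, b2, a1, a2 = coeff_net(x[t])
        y[t] = (b0 * x[t]
                + b1 * (x[t - 1] if t >= 1 else 0.0)
                + b2 * (x[t - 2] if t >= 2 else 0.0)
                - a1 * (y[t - 1] if t >= 1 else 0.0)
                - a2 * (y[t - 2] if t >= 2 else 0.0))
    return y

def toy_coeffs(xt):
    """Hypothetical coefficient 'network': mild, stability-biased values."""
    g = np.tanh(xt)
    return 1.0, 0.5 * g, 0.25 * g, -0.4 * g, 0.1

rng = np.random.default_rng(3)
print(adaptive_iir2(rng.normal(size=32), toy_coeffs)[:5])
```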

Training Large Scale Polynomial CNNs for E2E Inference over Homomorphic Encryption

no code implementations • 26 Apr 2023 • Moran Baruch, Nir Drucker, Gilad Ezov, Yoav Goldberg, Eyal Kushnir, Jenny Lerner, Omri Soceanu, Itamar Zimerman

Training large-scale CNNs that during inference can be run under Homomorphic Encryption (HE) is challenging due to the need to use only polynomial operations.

Privacy Preserving • Transfer Learning
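To illustrate the polynomial-operations constraint mentioned above: non-polynomial activations such as ReLU must be replaced by low-degree polynomial surrogates before a network can be evaluated under HE. A minimal sketch, fitting a degree-2 polynomial to ReLU over an assumed activation range (the range, degree, and fitting method are illustrative, not the paper's training recipe):

```python
import numpy as np

# HE evaluation supports only additions and multiplications, so fit a
# low-degree polynomial to ReLU over the range activations are expected to take.
xs = np.linspace(-5, 5, 1001)
coeffs = np.polyfit(xs, np.maximum(xs, 0), deg=2)   # least-squares fit

def poly_relu(x):
    """HE-friendly polynomial stand-in for ReLU."""
    return np.polyval(coeffs, x)

print(coeffs)                               # quadratic, linear, constant terms
print(poly_relu(np.array([-2.0, 0.0, 2.0])))
```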

Recovering AES Keys with a Deep Cold Boot Attack

no code implementations • 9 Jun 2021 • Itamar Zimerman, Eliya Nachmani, Lior Wolf

In this work, we combine a novel cryptographic variant of a deep error correcting code technique with a modified SAT solver scheme to apply the attack on AES keys.

Cryptanalysis • Scheduling
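For context, a toy simulation of the asymmetric bit-decay that a cold boot attack must invert: after power loss, memory bits set to one decay to zero with some probability, while the reverse flip is rare. The probabilities and key size below are assumptions; the actual pipeline (a deep error-correcting decoder combined with a SAT solver over the AES key schedule) is far beyond this sketch.

```python
import numpy as np

def cold_boot_corrupt(key_bits, p_decay=0.3, p_flip_up=0.001, seed=0):
    """Simulate asymmetric cold-boot decay on a bit vector:
    ones decay to zero with probability p_decay; zeros rarely flip up."""
    rng = np.random.default_rng(seed)
    bits = key_bits.copy()
    decay = (bits == 1) & (rng.random(bits.shape) < p_decay)
    flip_up = (bits == 0) & (rng.random(bits.shape) < p_flip_up)
    bits[decay] = 0
    bits[flip_up] = 1
    return bits

key = np.random.default_rng(42).integers(0, 2, size=128)   # toy 128-bit key
corrupted = cold_boot_corrupt(key)
print((key != corrupted).sum(), "bits corrupted out of", key.size)
```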
