Search Results for author: Essam Sleiman

Found 3 papers, 2 papers with code

SlowFormer: Universal Adversarial Patch for Attack on Compute and Energy Efficiency of Inference Efficient Vision Transformers

1 code implementation • 4 Oct 2023 • KL Navaneet, Soroush Abbasi Koohpayegani, Essam Sleiman, Hamed Pirsiavash

We show that such models can be vulnerable to a universal adversarial patch attack, where the attacker optimizes a patch that, when pasted on any image, increases the compute and power consumption of the model.
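The linked implementation covers the full attack; as a rough illustration of the idea only, here is a minimal PyTorch sketch of optimizing one shared patch to maximize an inference-cost objective. The `compute_proxy` callable is an assumption, a stand-in for a differentiable estimate of the model's compute (e.g., the expected number of tokens an input-adaptive ViT retains), and is not the paper's released code.

```python
import torch

def apply_patch(images, patch, x0=0, y0=0):
    # Paste one shared patch onto every image at a fixed location;
    # broadcasting copies the (1, 3, p, p) patch across the batch.
    patched = images.clone()
    p = patch.size(-1)
    patched[:, :, y0:y0 + p, x0:x0 + p] = patch
    return patched

def train_universal_patch(model, loader, compute_proxy, patch_size=32,
                          steps=1000, lr=0.01, device="cpu"):
    # The patch pixels are the only trainable parameters; the victim
    # model stays frozen.
    for w in model.parameters():
        w.requires_grad_(False)
    model.eval()
    patch = torch.rand(1, 3, patch_size, patch_size, device=device,
                       requires_grad=True)
    opt = torch.optim.Adam([patch], lr=lr)
    data = iter(loader)
    for _ in range(steps):
        try:
            images, _ = next(data)
        except StopIteration:
            data = iter(loader)
            images, _ = next(data)
        patched = apply_patch(images.to(device), patch.clamp(0, 1))
        # compute_proxy (hypothetical) returns a differentiable scalar
        # that grows with inference cost; maximize it by minimizing
        # its negation.
        loss = -compute_proxy(model, patched)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return patch.detach().clamp(0, 1)
```

Note the patch is universal: a single tensor is trained across the whole data stream rather than per image, which is what makes it reusable at attack time.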

Mitigating Negative Transfer in Multi-Task Learning with Exponential Moving Average Loss Weighting Strategies

no code implementations • 22 Nov 2022 • Anish Lakkapragada, Essam Sleiman, Saimourya Surabhi, Dennis P. Wall

Multi-Task Learning (MTL) is a growing subject of interest in deep learning due to its ability to train models on multiple tasks more efficiently than a group of conventional single-task models. A minimal sketch of the EMA weighting idea follows this entry.

Multi-Task Learning
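This entry has no code, so the following is a hedged sketch of one plausible EMA-based loss-weighting variant, not the paper's exact strategy: each task loss is normalized by an exponential moving average of its own recent magnitude, so no single task's scale dominates the combined gradient. The class name and the `beta`/`eps` defaults are assumptions for illustration.

```python
import torch

class EMALossWeighter:
    """Balance task losses by normalizing each one with an exponential
    moving average (EMA) of its recent magnitude (illustrative sketch)."""

    def __init__(self, num_tasks, beta=0.9, eps=1e-8):
        self.beta = beta          # EMA smoothing factor (assumed value)
        self.eps = eps            # guards against division by zero
        self.ema = [None] * num_tasks

    def combine(self, task_losses):
        total = 0.0
        for i, loss in enumerate(task_losses):
            value = loss.detach()  # track magnitude without gradients
            if self.ema[i] is None:
                self.ema[i] = value
            else:
                self.ema[i] = self.beta * self.ema[i] + (1 - self.beta) * value
            # Dividing by the detached EMA rescales each task onto a
            # comparable range; gradients flow only through `loss`.
            total = total + loss / (self.ema[i] + self.eps)
        return total

# Usage: weighter = EMALossWeighter(num_tasks=2)
#        total = weighter.combine([loss_task_a, loss_task_b])
#        total.backward()
```

Detaching the EMA is the key design choice here: the weights adapt to loss scales over training but never feed back into the gradient computation themselves.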
