Search Results for author: Motasem Alfarra

Found 22 papers, 12 papers with code

Combating Missing Modalities in Egocentric Videos at Test Time

no code implementations 23 Apr 2024 Merey Ramazanova, Alejandro Pardo, Bernard Ghanem, Motasem Alfarra

Understanding videos that contain multiple modalities is crucial, especially in egocentric videos, where combining various sensory inputs significantly improves tasks like action recognition and moment localization.

Action Recognition · Test-time Adaptation

Revisiting Test Time Adaptation under Online Evaluation

1 code implementation 10 Apr 2023 Motasem Alfarra, Hani Itani, Alejandro Pardo, Shyma Alhuwaider, Merey Ramazanova, Juan C. Pérez, Zhipeng Cai, Matthias Müller, Bernard Ghanem

To address this issue, we propose a more realistic evaluation protocol for TTA methods, where data is received in an online fashion from a constant-speed data stream, thereby accounting for the method's adaptation speed.

Test-time Adaptation
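In this protocol, a method whose adaptation step is slower than the stream effectively skips samples: it must label them with whatever parameters it has at that moment. The sketch below illustrates that accounting with a hypothetical adapt_and_predict/predict interface and an integer slowdown factor; it is a simplified illustration of the idea, not the paper's exact evaluation code.

def online_evaluate(model, stream, slowdown=1):
    """Evaluate a test-time-adaptation method against a constant-speed stream.

    Assumption: a method whose per-batch adaptation is `slowdown` times slower
    than the stream rate only gets to adapt on every `slowdown`-th batch and
    must label the remaining batches with its current (stale) parameters.
    `adapt_and_predict` and `predict` are hypothetical interfaces.
    """
    correct, total = 0, 0
    for step, (x, y) in enumerate(stream):
        if step % slowdown == 0:
            preds = model.adapt_and_predict(x)  # method keeps up: adapt, then predict
        else:
            preds = model.predict(x)            # method is still busy: predict without adapting
        correct += int((preds == y).sum())
        total += len(y)
    return correct / total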

Online Distillation with Continual Learning for Cyclic Domain Shifts

1 code implementation 3 Apr 2023 Joachim Houyon, Anthony Cioppa, Yasir Ghunaim, Motasem Alfarra, Anaïs Halin, Maxim Henry, Bernard Ghanem, Marc Van Droogenbroeck

In this paper, we propose a solution to this issue by leveraging the power of continual learning methods to reduce the impact of domain shifts.

Autonomous Driving · Continual Learning

Real-Time Evaluation in Online Continual Learning: A New Hope

1 code implementation CVPR 2023 Yasir Ghunaim, Adel Bibi, Kumail Alhamoud, Motasem Alfarra, Hasan Abed Al Kader Hammoud, Ameya Prabhu, Philip H. S. Torr, Bernard Ghanem

We show that a simple baseline outperforms state-of-the-art CL methods under this evaluation, questioning the applicability of existing methods in realistic settings.

Continual Learning

SimCS: Simulation for Domain Incremental Online Continual Segmentation

no code implementations 29 Nov 2022 Motasem Alfarra, Zhipeng Cai, Adel Bibi, Bernard Ghanem, Matthias Müller

This work explores the problem of Online Domain-Incremental Continual Segmentation (ODICS), where the model is continually trained over batches of densely labeled images from different domains, with limited computation and no information about the task boundaries.

Autonomous Driving · Continual Learning +2

Generalizability of Adversarial Robustness Under Distribution Shifts

no code implementations 29 Sep 2022 Kumail Alhamoud, Hasan Abed Al Kader Hammoud, Motasem Alfarra, Bernard Ghanem

Recent progress in empirical and certified robustness promises to deliver reliable and deployable Deep Neural Networks (DNNs).

Adversarial Robustness · Domain Generalization

Certified Robustness in Federated Learning

1 code implementation 6 Jun 2022 Motasem Alfarra, Juan C. Pérez, Egor Shulgin, Peter Richtárik, Bernard Ghanem

However, as in the single-node supervised learning setup, models trained in federated learning are vulnerable to imperceptible input transformations known as adversarial attacks, which calls their deployment in security-related applications into question.

Federated Learning
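Although the abstract above does not spell out the recipe, certifying a federated model typically means preparing the global classifier for randomized smoothing, e.g., by having each client train on Gaussian-noise-augmented inputs before standard parameter averaging. The following is a hedged sketch of that combination (plain federated averaging plus noise augmentation), not the paper's exact scheme.

import copy
import torch
import torch.nn.functional as F

def fedavg_round_with_noise(global_model, client_loaders, sigma=0.25, lr=0.01, local_steps=10):
    """One hypothetical round: local SGD on noise-augmented data, then parameter averaging.

    Assumes all parameters and buffers are floating point and that each loader
    yields at least `local_steps` (inputs, labels) batches. Illustrative sketch only.
    """
    client_states = []
    for loader in client_loaders:
        local = copy.deepcopy(global_model)
        opt = torch.optim.SGD(local.parameters(), lr=lr)
        data = iter(loader)
        for _ in range(local_steps):
            x, y = next(data)
            x = x + sigma * torch.randn_like(x)          # Gaussian noise augmentation
            loss = F.cross_entropy(local(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
        client_states.append(local.state_dict())
    averaged = {k: torch.stack([s[k] for s in client_states]).mean(dim=0)
                for k in client_states[0]}               # uniform FedAvg aggregation
    global_model.load_state_dict(averaged)
    return global_model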

3DeformRS: Certifying Spatial Deformations on Point Clouds

1 code implementation CVPR 2022 Gabriel Pérez S., Juan C. Pérez, Motasem Alfarra, Silvio Giancola, Bernard Ghanem

In this work, we propose 3DeformRS, a method to certify the robustness of point cloud Deep Neural Networks (DNNs) against real-world deformations.

Autonomous Driving

Towards Assessing and Characterizing the Semantic Robustness of Face Recognition

no code implementations 10 Feb 2022 Juan C. Pérez, Motasem Alfarra, Ali Thabet, Pablo Arbeláez, Bernard Ghanem

We propose a methodology for assessing and characterizing the robustness of FRMs against semantic perturbations to their input.

Face Recognition

On the Robustness of Quality Measures for GANs

1 code implementation 31 Jan 2022 Motasem Alfarra, Juan C. Pérez, Anna Frühstück, Philip H. S. Torr, Peter Wonka, Bernard Ghanem

Finally, we show that the FID can be robustified by simply replacing the standard Inception with a robust Inception.
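For context, the FID mentioned here is the Fréchet distance between Gaussian fits to feature statistics of real and generated images; "replacing the standard Inception with a robust Inception" only changes the feature extractor, not the distance itself. A small reference sketch of the distance, assuming precomputed feature matrices:

import numpy as np
from scipy import linalg

def frechet_distance(feats_real, feats_fake):
    """Fréchet (Inception) Distance between two feature matrices (rows = samples).

    The features are assumed to come from some fixed extractor (standard or
    robust Inception); this function only implements the distance.
    """
    mu1, mu2 = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    cov1 = np.cov(feats_real, rowvar=False)
    cov2 = np.cov(feats_fake, rowvar=False)
    covmean = linalg.sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):      # sqrtm can return tiny imaginary parts numerically
        covmean = covmean.real
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(cov1 + cov2 - 2.0 * covmean))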

ANCER: Anisotropic Certification via Sample-wise Volume Maximization

1 code implementation 9 Jul 2021 Francisco Eiras, Motasem Alfarra, M. Pawan Kumar, Philip H. S. Torr, Puneet K. Dokania, Bernard Ghanem, Adel Bibi

Randomized smoothing has recently emerged as an effective tool that enables certification of deep neural network classifiers at scale.

DeformRS: Certifying Input Deformations with Randomized Smoothing

2 code implementations 2 Jul 2021 Motasem Alfarra, Adel Bibi, Naeemullah Khan, Philip H. S. Torr, Bernard Ghanem

Deep neural networks are vulnerable to input deformations in the form of vector fields of pixel displacements and to other parameterized geometric deformations, e.g., translations, rotations, etc.
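Roughly speaking, certifying a parameterized deformation means smoothing over the (low-dimensional) deformation parameters rather than over pixels. The snippet below illustrates that idea for a single rotation-angle parameter via Monte Carlo majority voting; it is an illustrative sketch, not DeformRS itself.

import torch
import torchvision.transforms.functional as TF

@torch.no_grad()
def smoothed_predict_rotation(model, x, num_classes, sigma_deg=10.0, n_samples=100):
    """Majority vote of `model` over rotated copies of `x`, with angle ~ N(0, sigma_deg^2).

    `x` is a single image batch of shape (1, C, H, W); smoothing happens in the
    one-dimensional rotation-parameter space instead of pixel space.
    """
    votes = torch.zeros(num_classes)
    for _ in range(n_samples):
        angle = sigma_deg * torch.randn(1).item()        # sample a rotation angle
        logits = model(TF.rotate(x, angle))              # deform, then classify
        votes[int(logits.argmax(dim=-1).item())] += 1
    return int(votes.argmax().item())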

On the Decision Boundaries of Neural Networks: A Tropical Geometry Perspective

no code implementations 1 Jan 2021 Motasem Alfarra, Adel Bibi, Hasan Abed Al Kader Hammoud, Mohamed Gaafar, Bernard Ghanem

This work tackles the problem of characterizing and understanding the decision boundaries of neural networks with piecewise-linear activations.

Network Pruning

Data-Dependent Randomized Smoothing

no code implementations 8 Dec 2020 Motasem Alfarra, Adel Bibi, Philip H. S. Torr, Bernard Ghanem

In this work, we revisit Gaussian randomized smoothing and show that the variance of the Gaussian distribution can be optimized at each input so as to maximize the certification radius for the construction of the smooth classifier.
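The certificate being maximized here is the usual randomized-smoothing radius, which scales linearly with the Gaussian's standard deviation sigma, so choosing sigma per input trades the confidence of the majority class against the radius multiplier. A minimal sketch of estimating that radius for a fixed sigma (a simplified Cohen-et-al.-style recipe using a plug-in estimate instead of a confidence bound, not the paper's per-input optimization):

import torch
from scipy.stats import norm

@torch.no_grad()
def certified_radius(model, x, sigma, n_samples=1000):
    """Estimate the smoothed prediction and its certified L2 radius for one input.

    Uses the simplified certificate R = sigma * Phi^{-1}(p_top); a rigorous
    certificate would replace the plug-in estimate `p_top` with a lower
    confidence bound. `x` is a single image batch of shape (1, C, H, W).
    """
    counts = None
    for _ in range(n_samples):
        logits = model(x + sigma * torch.randn_like(x))   # classify a noisy copy
        if counts is None:
            counts = torch.zeros(logits.shape[-1])
        counts[int(logits.argmax(dim=-1).item())] += 1
    p_top = counts.max().item() / n_samples
    if p_top <= 0.5:
        return None, None                                  # abstain: no certificate
    return int(counts.argmax().item()), sigma * norm.ppf(p_top)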

Rethinking Clustering for Robustness

1 code implementation 13 Jun 2020 Motasem Alfarra, Juan C. Pérez, Adel Bibi, Ali Thabet, Pablo Arbeláez, Bernard Ghanem

This paper studies how encouraging semantically-aligned features during deep neural network training can increase network robustness.

Clustering

Adaptive Learning of the Optimal Batch Size of SGD

no code implementations 3 May 2020 Motasem Alfarra, Slavomir Hanzely, Alyazeed Albasyoni, Bernard Ghanem, Peter Richtárik

Recent advances in the theoretical understanding of SGD led to a formula for the optimal batch size minimizing the number of effective data passes, i.e., the number of iterations times the batch size.

On the Decision Boundaries of Neural Networks: A Tropical Geometry Perspective

no code implementations 20 Feb 2020 Motasem Alfarra, Adel Bibi, Hasan Hammoud, Mohamed Gaafar, Bernard Ghanem

Our main finding is that the decision boundaries are a subset of a tropical hypersurface, which is intimately related to a polytope formed by the convex hull of two zonotopes.

Network Pruning
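For readers unfamiliar with the terminology: tropical geometry works over the max-plus semiring, where the "polynomials" are maxima of affine functions, exactly the functions computed by ReLU layers. A short note on the conventions assumed by the statement above (paraphrasing standard definitions, not the paper's proofs):

In the max-plus semiring $(\mathbb{R} \cup \{-\infty\}, \oplus, \odot)$ one sets $a \oplus b = \max(a, b)$ and $a \odot b = a + b$, so a tropical polynomial is a function of the form $p(x) = \max_i \,(a_i^{\top} x + b_i)$, and its tropical hypersurface is the set of points where this maximum is attained by at least two of the affine terms. Each output of a network of the form (Affine, ReLU, Affine) can be written as a difference of two such tropical polynomials, which is how the decision boundary ends up inside a tropical hypersurface whose combinatorics are captured by the polytope mentioned above (the convex hull of two zonotopes).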

On the Decision Boundaries of Deep Neural Networks: A Tropical Geometry Perspective

no code implementations 25 Sep 2019 Motasem Alfarra, Adel Bibi, Hasan Hammoud, Mohamed Gaafar, Bernard Ghanem

We use tropical geometry, a new development in the area of algebraic geometry, to provide a characterization of the decision boundaries of a simple neural network of the form (Affine, ReLU, Affine).

Network Pruning
