Search Results for author: Kerem Ozfatura

Found 7 papers, 0 papers with code

Aggressive or Imperceptible, or Both: Network Pruning Assisted Hybrid Byzantines in Federated Learning

no code implementations · 9 Apr 2024 · Emre Ozfatura, Kerem Ozfatura, Alptekin Kupcu, Deniz Gunduz

Hence, inspired by sparse neural networks, we introduce a hybrid sparse Byzantine attack composed of two parts: one sparse in nature, attacking only certain NN locations with higher sensitivity, and the other more silent but accumulating over time. Each part ideally targets a different type of defence mechanism, and together they form a strong but imperceptible attack.
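As a rough illustration of the two-part idea, the sketch below builds a Byzantine update from a sparse, high-magnitude perturbation on the most "sensitive" coordinates plus a small bias that accumulates across rounds. The sensitivity proxy (coordinate-wise variance), function names, and hyperparameters are assumptions for illustration, not the paper's actual construction:

```python
import numpy as np

def hybrid_sparse_attack(benign_grads, sparse_frac=0.01, eps_sparse=1.0,
                         eps_drift=0.05, drift_state=None):
    """Sketch of a two-part Byzantine update: a sparse perturbation on the
    most sensitive coordinates plus a small drift accumulated across rounds.
    Illustrative only; the paper's construction may differ."""
    mu = np.mean(benign_grads, axis=0)       # benign reference update
    sigma = np.std(benign_grads, axis=0)     # coordinate-wise sensitivity proxy
    d = mu.size
    if drift_state is None:
        drift_state = np.zeros(d)

    # Part 1: sparse attack on the k most sensitive coordinates.
    k = max(1, int(sparse_frac * d))
    idx = np.argsort(sigma)[-k:]             # largest-variance coordinates
    perturb = np.zeros(d)
    perturb[idx] = -eps_sparse * sigma[idx]  # push against the benign direction

    # Part 2: silent bias that accumulates over rounds.
    drift_state = drift_state + eps_drift * sigma
    return mu + perturb + drift_state, drift_state
```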

Federated Learning · Network Pruning · +1

Byzantines can also Learn from History: Fall of Centered Clipping in Federated Learning

no code implementations · 21 Aug 2022 · Kerem Ozfatura, Emre Ozfatura, Alptekin Kupcu, Deniz Gunduz

The centered clipping (CC) framework has further shown that the momentum term from the previous iteration, besides reducing the variance, can also serve as a reference point to better neutralize Byzantine attacks.
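For reference, here is a minimal sketch of a centered-clipping-style aggregator, the defence this paper targets: each worker's momentum is clipped around the previous aggregate, which acts as the reference point. The clipping radius tau and the iteration count are assumed hyperparameters:

```python
import numpy as np

def centered_clipping(updates, v_prev, tau=1.0, iters=1):
    """Sketch of centered clipping: clip each worker update's deviation
    from the previous aggregate v_prev to radius tau, then recenter."""
    v = v_prev.copy()
    for _ in range(iters):
        clipped = []
        for x in updates:
            diff = x - v
            norm = np.linalg.norm(diff)
            scale = min(1.0, tau / norm) if norm > 0 else 1.0
            clipped.append(diff * scale)     # deviation clipped to radius tau
        v = v + np.mean(clipped, axis=0)     # recenter around clipped mean
    return v
```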

Federated Learning · Image Classification

Federated Spatial Reuse Optimization in Next-Generation Decentralized IEEE 802.11 WLANs

no code implementations · 20 Mar 2022 · Francesc Wilhelmi, Jernej Hribar, Selim F. Yilmaz, Emre Ozfatura, Kerem Ozfatura, Ozlem Yildiz, Deniz Gündüz, Hao Chen, Xiaoying Ye, Lizhao You, Yulin Shao, Paolo Dini, Boris Bellalta

As wireless standards evolve, more complex functionalities are introduced to address the increasing requirements in terms of throughput, latency, security, and efficiency.

Federated Learning

Less is More: Feature Selection for Adversarial Robustness with Compressive Counter-Adversarial Attacks

no code implementations · ICML Workshop AML 2021 · Emre Ozfatura, Muhammad Zaid Hameed, Kerem Ozfatura, Deniz Gunduz

Hence, we propose a novel approach to identify the important features by employing counter-adversarial attacks, which highlight the consistency of the penultimate-layer representations under perturbations of the input samples.
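A minimal sketch of what such consistency-based selection could look like, assuming penultimate-layer activations for clean inputs and for their counter-adversarially perturbed versions have already been computed; the array names and the keep fraction are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

def select_consistent_features(feats_clean, feats_counter, keep_frac=0.5):
    """Sketch: keep the penultimate-layer features whose activations
    change the least between clean inputs (feats_clean) and their
    counter-adversarially perturbed versions (feats_counter)."""
    # Mean absolute change per feature across the batch.
    drift = np.mean(np.abs(feats_clean - feats_counter), axis=0)
    k = max(1, int(keep_frac * drift.size))
    keep = np.argsort(drift)[:k]             # most consistent features
    mask = np.zeros(drift.size, dtype=bool)
    mask[keep] = True
    return mask
```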

Adversarial Robustness · feature selection

Time-Correlated Sparsification for Communication-Efficient Federated Learning

no code implementations · 21 Jan 2021 · Emre Ozfatura, Kerem Ozfatura, Deniz Gunduz

Sparse communication is often employed to reduce the communication load, in which only a small subset of the model updates is communicated from the clients to the parameter server (PS).
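A minimal sketch of this kind of sparse communication, assuming plain top-K selection; the paper's time-correlation and any error-feedback machinery are omitted:

```python
import numpy as np

def topk_sparsify(update, k):
    """Sketch of sparse communication: the client sends only the k
    largest-magnitude entries of its update as an (indices, values) pair."""
    idx = np.argsort(np.abs(update))[-k:]    # k largest-magnitude coordinates
    return idx, update[idx]

def desparsify(idx, vals, d):
    """Reconstruct the dense update of dimension d on the PS side."""
    dense = np.zeros(d)
    dense[idx] = vals
    return dense
```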

Federated Learning · Quantization

FedADC: Accelerated Federated Learning with Drift Control

no code implementations · 16 Dec 2020 · Kerem Ozfatura, Emre Ozfatura, Deniz Gunduz

The core of the FL strategy is the use of stochastic gradient descent (SGD) in a distributed manner.
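A toy sketch of one round of this distributed SGD, using a least-squares objective as a hypothetical stand-in for each client's real loss; FedADC's drift-control mechanism itself is not shown here:

```python
import numpy as np

def local_sgd(w, X, y, lr=0.1, steps=5):
    """One client's local SGD on a toy objective 0.5 * ||Xw - y||^2
    (an illustrative stand-in for the actual training loss)."""
    w = w.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)    # gradient of the toy loss
        w -= lr * grad
    return w

def fl_round(global_w, clients, lr=0.1, steps=5):
    """One FL round: clients run local SGD from the shared model and
    the server averages the returned models."""
    local_models = [local_sgd(global_w, X, y, lr, steps) for X, y in clients]
    return np.mean(local_models, axis=0)
```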

Federated Learning

Distributed Sparse SGD with Majority Voting

no code implementations · 12 Nov 2020 · Kerem Ozfatura, Emre Ozfatura, Deniz Gunduz

However, top-K sparsification incurs additional communication load to represent the sparsity pattern, and the mismatch between the workers' sparsity patterns prevents the use of efficient communication protocols.
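One way to read the majority-voting idea, sketched under simplifying assumptions: each worker votes for its own top-K coordinates, and the server keeps the K most-voted coordinates as a shared sparsity pattern, so that all workers transmit values on the same indices without per-worker index lists. Details differ in the paper:

```python
import numpy as np

def majority_vote_pattern(worker_updates, k):
    """Sketch: build a shared sparsity pattern from the k coordinates
    that receive the most top-k votes across workers."""
    d = worker_updates[0].size
    votes = np.zeros(d, dtype=int)
    for u in worker_updates:
        votes[np.argsort(np.abs(u))[-k:]] += 1   # each worker's top-k vote
    pattern = np.argsort(votes)[-k:]             # most-voted coordinates
    return np.sort(pattern)
```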
