Search Results for author: Syed Zawad

Found 6 papers, 1 paper with code

Speed Up Federated Learning in Heterogeneous Environment: A Dynamic Tiering Approach

1 code implementation · 9 Dec 2023 · Seyed Mahmoud Sajjadi Mohammadabadi, Syed Zawad, Feng Yan, Lei Yang

The dynamic tier scheduler assigns clients to suitable tiers to minimize the overall training time in each round.

Ranked #1 on Image Classification on CIFAR-10 (training time (s) metric)

Federated Learning · Image Classification
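The abstract snippet above describes a scheduler that re-bins clients into tiers each round to minimize round time. A minimal sketch of that tiering step, assuming clients are grouped by their most recently profiled per-round latency (all names and the grouping rule are illustrative, not the authors' implementation):

```python
# Illustrative sketch of dynamic tier assignment: sort clients by their
# latest profiled per-round training time and split them into tiers of
# roughly equal size, so each tier contains clients of similar speed.
# Re-running this every round makes the tiering "dynamic".

def assign_tiers(latencies, num_tiers):
    """Sort clients by profiled latency and split into tiers (tier 0 = fastest)."""
    order = sorted(latencies, key=latencies.get)
    size = max(1, len(order) // num_tiers)
    tiers = [order[i:i + size] for i in range(0, len(order), size)]
    # Fold any overflow chunk into the last (slowest) tier.
    return tiers[:num_tiers - 1] + [sum(tiers[num_tiers - 1:], [])]

# Hypothetical per-round latencies (seconds) for six clients.
latencies = {"c1": 1.2, "c2": 0.4, "c3": 3.1, "c4": 0.9, "c5": 2.0, "c6": 0.5}
tiers = assign_tiers(latencies, num_tiers=3)
```

Selecting a round's participants from a single tier then keeps per-round training time close to that tier's latency rather than the slowest client's.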

SMLT: A Serverless Framework for Scalable and Adaptive Machine Learning Design and Training

no code implementations · 4 May 2022 · Ahsan Ali, Syed Zawad, Paarijaat Aditya, Istemi Ekin Akkus, Ruichuan Chen, Feng Yan

In addition, by providing an end-to-end design, SMLT addresses the intrinsic problems of serverless platforms, such as communication overhead, limited function execution duration, and the need for repeated initialization, and it also provides explicit fault tolerance for ML training.

BIG-bench Machine Learning · Management · +1

Demystifying Hyperparameter Optimization in Federated Learning

no code implementations · 29 Sep 2021 · Syed Zawad, Jun Yi, Minjia Zhang, Cheng Li, Feng Yan, Yuxiong He

Such data heterogeneity and privacy requirements bring unique challenges for hyperparameter optimization: training dynamics change across clients even within the same training round, and they are difficult to measure due to privacy constraints.

Federated Learning · Hyperparameter Optimization · +1

Curse or Redemption? How Data Heterogeneity Affects the Robustness of Federated Learning

no code implementations · 1 Feb 2021 · Syed Zawad, Ahsan Ali, Pin-Yu Chen, Ali Anwar, Yi Zhou, Nathalie Baracaldo, Yuan Tian, Feng Yan

Data heterogeneity has been identified as one of the key features of federated learning, but it is often overlooked through the lens of robustness to adversarial attacks.

Federated Learning

TiFL: A Tier-based Federated Learning System

no code implementations · 25 Jan 2020 · Zheng Chai, Ahsan Ali, Syed Zawad, Stacey Truex, Ali Anwar, Nathalie Baracaldo, Yi Zhou, Heiko Ludwig, Feng Yan, Yue Cheng

To this end, we propose TiFL, a Tier-based Federated Learning system, which divides clients into tiers based on their training performance and selects clients from the same tier in each training round, mitigating the straggler problem caused by heterogeneity in resources and data quantity.

Federated Learning
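The snippet above says TiFL draws each round's participants from a single tier. A minimal sketch of that per-round selection, assuming tiers have already been built from profiled training performance (the function name and sampling policy are illustrative, not TiFL's actual code):

```python
import random

# Illustrative sketch of tier-based client selection: pick one tier for
# the round, then sample all participants from it, so no slow straggler
# from another tier holds up faster clients.

def select_clients(tiers, clients_per_round, rng):
    """Choose a tier, then sample the round's participants entirely from it."""
    tier = rng.choice(tiers)
    k = min(clients_per_round, len(tier))
    return rng.sample(tier, k)

rng = random.Random(0)
tiers = [["c2", "c6"], ["c4", "c1"], ["c5", "c3"]]
selected = select_clients(tiers, clients_per_round=2, rng=rng)
```

In the paper's full design the tier choice is not uniform: which tier is picked can be biased (e.g., by observed accuracy), but the invariant shown here, that all selected clients share a tier, is what removes stragglers from the round.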

EPNAS: Efficient Progressive Neural Architecture Search

no code implementations · 7 Jul 2019 · Yanqi Zhou, Peng Wang, Sercan Arik, Haonan Yu, Syed Zawad, Feng Yan, Greg Diamos

In this paper, we propose Efficient Progressive Neural Architecture Search (EPNAS), a neural architecture search (NAS) method that efficiently handles a large search space through a novel progressive search policy with performance prediction based on REINFORCE (Williams, 1992).

Neural Architecture Search
