Search Results for author: Ravi Madduri

Found 5 papers, 3 papers with code

Advances in APPFL: A Comprehensive and Extensible Federated Learning Framework

1 code implementation • 17 Sep 2024 • Zilinghan Li, Shilan He, Ze Yang, Minseok Ryu, Kibaek Kim, Ravi Madduri

Federated learning (FL) is a distributed machine learning paradigm enabling collaborative model training while preserving data privacy.
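As a hedged illustration of the paradigm described above (not APPFL's actual API), the sketch below shows federated averaging (FedAvg), where the server combines client parameters weighted by local dataset size; the names `fed_avg`, `client_states`, and `client_sizes` are hypothetical.

```python
# Minimal FedAvg sketch (illustrative only; not the APPFL API).
# Each client trains locally and sends back only its model parameters;
# the server averages them, weighted by local dataset size.
from typing import Dict, List
import torch


def fed_avg(client_states: List[Dict[str, torch.Tensor]],
            client_sizes: List[int]) -> Dict[str, torch.Tensor]:
    """Weighted average of client state_dicts (hypothetical helper)."""
    total = float(sum(client_sizes))
    global_state = {}
    for name in client_states[0]:
        global_state[name] = sum(
            (n / total) * state[name].float()
            for state, n in zip(client_states, client_sizes)
        )
    return global_state
```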

Benchmarking • Federated Learning

AI Data Readiness Inspector (AIDRIN) for Quantitative Assessment of Data Readiness for AI

no code implementations • 27 Jun 2024 • Kaveen Hiniduma, Suren Byna, Jean Luca Bez, Ravi Madduri

"Garbage In Garbage Out" is a universally agreed quote by computer scientists from various domains, including Artificial Intelligence (AI).

Fairness • Feature Importance

Secure Federated Learning Across Heterogeneous Cloud and High-Performance Computing Resources -- A Case Study on Federated Fine-tuning of LLaMA 2

no code implementations • 19 Feb 2024 • Zilinghan Li, Shilan He, Pranshu Chaturvedi, Volodymyr Kindratenko, Eliu A Huerta, Kibaek Kim, Ravi Madduri

Federated learning enables multiple data owners to collaboratively train robust machine learning models without transferring large or sensitive local datasets by only sharing the parameters of the locally trained models.
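The key point in this snippet is that only locally trained parameters leave each site. The hedged client-side sketch below illustrates that idea; `local_update` and its arguments are illustrative names, not the framework's API.

```python
# Illustrative client-side update: the raw dataset never leaves the client;
# only the updated model parameters are returned to the server.
import torch
from torch import nn
from torch.utils.data import DataLoader


def local_update(model: nn.Module, loader: DataLoader,
                 epochs: int = 1, lr: float = 0.01) -> dict:
    """Train on local data and return only the parameters (hypothetical helper)."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return {k: v.detach().cpu() for k, v in model.state_dict().items()}
```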

Cloud Computing • Federated Learning • +1

FedCompass: Efficient Cross-Silo Federated Learning on Heterogeneous Client Devices using a Computing Power Aware Scheduler

1 code implementation • 26 Sep 2023 • Zilinghan Li, Pranshu Chaturvedi, Shilan He, Han Chen, Gagandeep Singh, Volodymyr Kindratenko, E. A. Huerta, Kibaek Kim, Ravi Madduri

Nonetheless, because of the disparity of computing resources among different clients (i.e., device heterogeneity), synchronous federated learning algorithms suffer from degraded efficiency when waiting for straggler clients.
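FedCompass tackles stragglers with a computing-power-aware scheduler. The sketch below only illustrates the general idea of giving each client a local-step budget proportional to its measured speed so that clients finish around the same time; it is a simplified illustration under assumed inputs, not the FedCompass algorithm itself.

```python
# Illustrative straggler-aware step assignment (not the FedCompass algorithm).
# Faster clients receive more local steps so all clients finish around the
# same wall-clock deadline, reducing server idle time in synchronous rounds.
from typing import Dict


def assign_local_steps(steps_per_sec: Dict[str, float],
                       round_seconds: float,
                       min_steps: int = 1) -> Dict[str, int]:
    """Give each client a step budget proportional to its measured speed."""
    return {
        client: max(min_steps, int(speed * round_seconds))
        for client, speed in steps_per_sec.items()
    }


# Example: a fast GPU client vs. a slow CPU client, with 60-second rounds.
print(assign_local_steps({"gpu_client": 5.0, "cpu_client": 0.5}, 60.0))
# {'gpu_client': 300, 'cpu_client': 30}
```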

Federated Learning

APPFLx: Providing Privacy-Preserving Cross-Silo Federated Learning as a Service

1 code implementation • 17 Aug 2023 • Zilinghan Li, Shilan He, Pranshu Chaturvedi, Trung-Hieu Hoang, Minseok Ryu, E. A. Huerta, Volodymyr Kindratenko, Jordan Fuhrman, Maryellen Giger, Ryan Chard, Kibaek Kim, Ravi Madduri

Cross-silo privacy-preserving federated learning (PPFL) is a powerful tool to collaboratively train robust and generalized machine learning (ML) models without sharing sensitive (e.g., healthcare or financial) local data.
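One common way PPFL systems protect shared updates is to clip and noise them before transmission, in the style of differential privacy. The sketch below shows that generic mechanism only; it is an assumption for illustration, not a description of the privacy algorithms APPFLx actually implements.

```python
# Generic clip-and-noise treatment of a model update before it is shared,
# a common differential-privacy-style mechanism in PPFL systems.
# Illustrative only; not necessarily the mechanism APPFLx implements.
import torch


def privatize_update(update: dict, clip_norm: float = 1.0,
                     noise_std: float = 0.1) -> dict:
    """Clip the update's global L2 norm, then add Gaussian noise (hypothetical helper)."""
    flat = torch.cat([v.flatten().float() for v in update.values()])
    scale = min(1.0, clip_norm / (flat.norm().item() + 1e-12))
    return {
        k: v.float() * scale + noise_std * torch.randn_like(v.float())
        for k, v in update.items()
    }
```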

Federated Learning • Privacy Preserving
