Search Results for author: Basak Guler

Found 11 papers, 0 papers with code

FLASH: Federated Learning Across Simultaneous Heterogeneities

no code implementations • 13 Feb 2024 • Xiangyu Chang, Sk Miraj Ahmed, Srikanth V. Krishnamurthy, Basak Guler, Ananthram Swami, Samet Oymak, Amit K. Roy-Chowdhury

The key premise of federated learning (FL) is to train ML models across a diverse set of data-owners (clients), without exchanging local data.

Federated Learning • Multi-Armed Bandits
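
To make the federated-learning premise mentioned in this entry concrete, here is a minimal federated-averaging sketch: clients train on their private data and only model updates are averaged. This is an illustrative toy, not the FLASH method; all function and variable names are assumptions.

```python
# Minimal federated-averaging sketch (illustrative; not the FLASH algorithm itself).
import numpy as np

def local_update(model, data, lr=0.1):
    """One step of local training on a client's private data (toy linear model)."""
    X, y = data
    grad = X.T @ (X @ model - y) / len(y)   # squared-loss gradient
    return model - lr * grad

def fedavg_round(global_model, client_datasets):
    """Each client trains locally; only model updates (never raw data) are averaged."""
    client_models = [local_update(global_model.copy(), d) for d in client_datasets]
    return np.mean(client_models, axis=0)

# Toy usage: three clients with heterogeneous local datasets.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
model = np.zeros(5)
for _ in range(10):
    model = fedavg_round(model, clients)
```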

Plug-and-Play Transformer Modules for Test-Time Adaptation

no code implementations • 6 Jan 2024 • Xiangyu Chang, Sk Miraj Ahmed, Srikanth V. Krishnamurthy, Basak Guler, Ananthram Swami, Samet Oymak, Amit K. Roy-Chowdhury

Parameter-efficient tuning (PET) methods such as LoRA, Adapter, and Visual Prompt Tuning (VPT) have found success in enabling adaptation to new domains by tuning small modules within a transformer model.

Test-time Adaptation • Visual Prompt Tuning
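
For context on the parameter-efficient tuning idea this entry refers to, the sketch below shows a LoRA-style low-rank adapter on a single linear layer: the pretrained weight is frozen and only a small low-rank correction is trained. It is a generic illustration under assumed names, not this paper's module design.

```python
# Minimal LoRA-style low-rank adapter sketch (illustrative; names are not from the paper).
import numpy as np

class LoRALinear:
    """Frozen weight W plus a trainable low-rank update B @ A (rank r << min(d_out, d_in))."""
    def __init__(self, W, rank=4, alpha=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = W                                                # frozen pretrained weight, (d_out, d_in)
        self.A = rng.normal(scale=0.01, size=(rank, W.shape[1]))  # trainable
        self.B = np.zeros((W.shape[0], rank))                     # trainable, initialized to zero
        self.alpha = alpha

    def forward(self, x):
        # Only A and B are updated during adaptation; W stays fixed.
        return x @ (self.W + self.alpha * self.B @ self.A).T

layer = LoRALinear(W=np.random.randn(16, 32), rank=4)
out = layer.forward(np.random.randn(8, 32))   # output shape (8, 16)
```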

Sparsified Secure Aggregation for Privacy-Preserving Federated Learning

no code implementations • 23 Dec 2021 • Irem Ergun, Hasin Us Sami, Basak Guler

Secure aggregation is a popular protocol in privacy-preserving federated learning, which allows model aggregation without revealing the individual models in the clear.

Federated Learning • Privacy Preserving
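
As background for the secure-aggregation setting this entry describes, the sketch below illustrates the standard pairwise-masking idea: each pair of clients shares a random mask that cancels in the sum, so the server only learns the aggregate. This is a toy over real numbers, not the paper's sparsified protocol (which would work over a finite field with keyed masks).

```python
# Minimal pairwise-masking secure aggregation sketch (illustrative; not the paper's protocol).
import numpy as np

def masked_updates(updates, seed=0):
    """Each pair (i, j) shares a random mask; i adds it, j subtracts it, so masks cancel in the sum."""
    rng = np.random.default_rng(seed)
    n, d = updates.shape
    masked = updates.astype(float)
    for i in range(n):
        for j in range(i + 1, n):
            mask = rng.normal(size=d)   # in practice derived from a shared key, over a finite field
            masked[i] += mask
            masked[j] -= mask
    return masked

updates = np.random.randn(5, 10)              # 5 clients, 10-dimensional model updates
masked = masked_updates(updates)
assert np.allclose(masked.sum(axis=0), updates.sum(axis=0))  # server learns only the sum
```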

LightSecAgg: a Lightweight and Versatile Design for Secure Aggregation in Federated Learning

no code implementations • 29 Sep 2021 • Jinhyun So, Chaoyang He, Chien-Sheng Yang, Songze Li, Qian Yu, Ramy E. Ali, Basak Guler, Salman Avestimehr

We also demonstrate that, unlike existing schemes, LightSecAgg can be applied to secure aggregation in the asynchronous FL setting.

Federated Learning

Securing Secure Aggregation: Mitigating Multi-Round Privacy Leakage in Federated Learning

no code implementations • 7 Jun 2021 • Jinhyun So, Ramy E. Ali, Basak Guler, Jiantao Jiao, Salman Avestimehr

In fact, we show that the conventional random user selection strategies in FL lead to leaking users' individual models within a number of rounds that is linear in the number of users.

Fairness • Federated Learning
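
The toy example below illustrates the kind of multi-round leakage this entry refers to: when the participation sets of two rounds differ by a single user, subtracting the two (individually secure) aggregates exposes that user's model. This is a deliberately simplified illustration (it assumes the user's model is unchanged across the two rounds), not the paper's analysis.

```python
# Toy illustration of multi-round leakage under random user selection (simplified:
# assumes each user's model is unchanged across the two rounds; not the paper's analysis).
import numpy as np

rng = np.random.default_rng(0)
user_models = rng.normal(size=(4, 6))         # 4 users, 6-dimensional models

round_a = [0, 1, 2, 3]                        # users selected in round A
round_b = [0, 1, 2]                           # users selected in round B
agg_a = user_models[round_a].sum(axis=0)      # server sees only this aggregate
agg_b = user_models[round_b].sum(axis=0)      # ...and this one

# The two participation sets differ by exactly user 3, so the difference of the
# aggregates reveals that user's individual model in the clear.
recovered = agg_a - agg_b
assert np.allclose(recovered, user_models[3])
```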

Sustainable Federated Learning

no code implementations • 22 Feb 2021 • Basak Guler, Aylin Yener

The potential environmental impact of machine learning carried out by large-scale wireless networks is a major challenge for the sustainability of future smart ecosystems.

BIG-bench Machine Learning • Federated Learning

Energy-Harvesting Distributed Machine Learning

no code implementations • 10 Feb 2021 • Basak Guler, Aylin Yener

This paper provides a first study of utilizing energy harvesting for sustainable machine learning in distributed networks.

BIG-bench Machine Learning • Edge-computing
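
As a rough illustration of the energy-harvesting setting this entry studies, the toy below has devices join a training round only when their harvested energy covers the round's cost. The participation rule, energy model, and all names are assumptions for illustration, not the paper's policy.

```python
# Toy energy-harvesting participation rule (illustrative; not the paper's scheme):
# a device joins a training round only if its harvested energy covers the round's cost.
import numpy as np

def run_rounds(n_devices=5, n_rounds=20, round_cost=1.0, seed=0):
    rng = np.random.default_rng(seed)
    battery = np.zeros(n_devices)
    participation = []
    for _ in range(n_rounds):
        battery += rng.exponential(scale=0.7, size=n_devices)  # stochastic energy arrivals
        active = battery >= round_cost                          # who can afford this round
        battery[active] -= round_cost
        participation.append(active.copy())
    return np.array(participation)

print(run_rounds().mean(axis=0))   # empirical participation rate per device
```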

A Scalable Approach for Privacy-Preserving Collaborative Machine Learning

no code implementations • NeurIPS 2020 • Jinhyun So, Basak Guler, A. Salman Avestimehr

We consider a collaborative learning scenario in which multiple data-owners wish to jointly train a logistic regression model, while keeping their individual datasets private from the other parties.

BIG-bench Machine Learning • Privacy Preserving
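
To make the collaborative logistic-regression scenario in this entry concrete, the sketch below shows each data owner computing a logistic-loss gradient on its private data, with only an aggregate of those gradients used to update the joint model. In the paper's privacy-preserving protocol that aggregate would be computed under a protected (coded/secret-shared) construction; this plain-text version is only a setting sketch with assumed names.

```python
# Minimal sketch of the collaborative logistic-regression setting (illustrative only;
# the paper's actual privacy-preserving protocol is not shown here).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_logistic_gradient(w, X, y):
    """Gradient of the logistic loss on one owner's private data."""
    return X.T @ (sigmoid(X @ w) - y) / len(y)

rng = np.random.default_rng(0)
owners = [(rng.normal(size=(30, 4)), rng.integers(0, 2, size=30)) for _ in range(3)]
w = np.zeros(4)
for _ in range(100):
    # In the privacy-preserving version only a protected aggregate of these
    # gradients would ever leave the data owners.
    w -= 0.5 * np.mean([local_logistic_gradient(w, X, y) for X, y in owners], axis=0)
```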

Byzantine-Resilient Secure Federated Learning

no code implementations • 21 Jul 2020 • Jinhyun So, Basak Guler, A. Salman Avestimehr

This presents a major challenge for the resilience of the model against adversarial (Byzantine) users, who can manipulate the global model by modifying their local models or datasets.

Federated Learning • Outlier Detection +2
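
As generic background for the Byzantine threat this entry describes, the sketch below contrasts plain averaging with coordinate-wise median aggregation, a standard robust aggregator that a minority of corrupted updates cannot drag arbitrarily far. It is not the paper's secure construction, just an illustration of why robust aggregation matters.

```python
# Generic robust aggregation sketch via coordinate-wise median (a standard Byzantine
# mitigation; illustrative only, not the paper's secure construction).
import numpy as np

def robust_aggregate(client_updates):
    """Coordinate-wise median is unaffected by a minority of arbitrarily corrupted updates."""
    return np.median(client_updates, axis=0)

rng = np.random.default_rng(0)
honest = rng.normal(loc=1.0, scale=0.1, size=(8, 5))   # 8 honest clients near the true update
byzantine = np.full((2, 5), 100.0)                     # 2 malicious clients send huge values
updates = np.vstack([honest, byzantine])

print(np.mean(updates, axis=0))    # plain averaging is pulled far off by the attackers
print(robust_aggregate(updates))   # the median stays close to the honest updates
```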

Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning

no code implementations • 11 Feb 2020 • Jinhyun So, Basak Guler, A. Salman Avestimehr

A major bottleneck in scaling federated learning to a large number of users is the overhead of secure model aggregation across many users.

Federated Learning
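
For context on the "quadratic aggregation barrier" in this entry's title, the back-of-the-envelope count below shows why pairwise-mask secure aggregation scales quadratically: every pair of users needs a shared mask or key, which grows as N(N-1)/2. This is general context, not the paper's own complexity analysis.

```python
# Back-of-the-envelope count of the quadratic cost of pairwise-mask secure aggregation
# (context for the "quadratic barrier"; not the paper's analysis).
def pairwise_masks(n_users: int) -> int:
    """Each pair of users needs a shared mask/key: n choose 2."""
    return n_users * (n_users - 1) // 2

for n in (100, 1_000, 10_000):
    print(n, pairwise_masks(n))   # grows as roughly n^2 / 2
```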
