1 code implementation • 14 Sep 2023 • Burak Hasircioglu, Deniz Gunduz
The task of preserving privacy while ensuring efficient communication is a fundamental challenge in federated learning.
no code implementations • 3 May 2022 • Burak Hasircioglu, Deniz Gunduz
In this work, we consider a federated setting in which clients participate in each round at random, in addition to subsampling their local datasets.
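A minimal sketch of the two sampling layers described above, assuming independent (Poisson-style) client participation and per-record data subsampling; the function name and probability parameters are illustrative, not from the paper:

```python
import random

def sample_round(clients, client_prob, data_prob):
    """One federated round: each client joins independently with
    probability client_prob; a participating client then subsamples
    each record of its local dataset with probability data_prob.
    Both layers of randomness can amplify differential privacy."""
    updates = []
    for dataset in clients:
        if random.random() < client_prob:  # random client participation
            batch = [x for x in dataset if random.random() < data_prob]
            updates.append(batch)
    return updates
```

With both probabilities set to 1 the scheme reduces to conventional full-participation FL, which is why the two sampling rates act as tunable privacy-amplification knobs.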
1 code implementation • 7 Feb 2022 • Selim F. Yilmaz, Burak Hasircioglu, Deniz Gunduz
We consider distributed inference at the wireless edge, where multiple clients with an ensemble of models, each trained independently on a local dataset, are queried in parallel to make an accurate decision on a new sample.
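One common way to fuse such parallel ensemble queries is to average the clients' class-probability outputs and take the argmax; the sketch below shows that fusion rule only, and does not model the wireless-channel aspect of the paper:

```python
import numpy as np

def ensemble_decision(models, x):
    """Query each client's independently trained model on sample x and
    fuse by averaging their class-probability vectors (soft voting).
    Each model is any callable returning a 1-D probability vector."""
    probs = np.stack([m(x) for m in models])   # shape: (clients, classes)
    return int(np.argmax(probs.mean(axis=0)))  # fused decision
```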
no code implementations • 16 Feb 2021 • Burak Hasircioglu, Jesus Gomez-Vilardebo, Deniz Gunduz
We consider the problem of private distributed matrix multiplication under limited resources.
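To illustrate the basic idea of private distributed matrix multiplication, here is a toy additive secret-sharing sketch: each worker sees only one uniformly random share of A, yet the user recovers A @ B from the workers' results. This is a simplified stand-in, not the coded scheme studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def private_matmul(A, B, n_workers=3):
    """Split A into n additive shares A_1 + ... + A_n = A; worker i
    computes A_i @ B on its share alone, and the user sums the worker
    results to obtain A @ B. Any single share reveals nothing about A."""
    shares = [rng.standard_normal(A.shape) for _ in range(n_workers - 1)]
    shares.append(A - sum(shares))          # last share completes the sum
    results = [S @ B for S in shares]       # each worker's computation
    return sum(results)
```

Resource limits (number of workers, their storage, and communication) constrain how such shares or codes can be distributed, which is the trade-off the paper analyzes.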
Information Theory • Cryptography and Security • Distributed, Parallel, and Cluster Computing
1 code implementation • 27 Jan 2021 • Mohammad Malekzadeh, Burak Hasircioglu, Nitish Mital, Kunal Katarya, Mehmet Emre Ozfatura, Deniz Gündüz
While rich medical datasets are hosted in hospitals distributed across the world, concerns about patients' privacy are a barrier against using such data to train deep neural networks (DNNs) for medical diagnostics.
no code implementations • 17 Nov 2020 • Burak Hasircioglu, Deniz Gunduz
In conventional federated learning (FL), differential privacy (DP) guarantees can be obtained by injecting additional noise into local model updates before transmitting them to the parameter server (PS).
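The noise-injection step mentioned above is typically the Gaussian mechanism applied to clipped updates; a minimal sketch follows, with illustrative parameter names (the paper's own scheme concerns avoiding this explicit injection by exploiting channel noise):

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_mult=1.1, rng=None):
    """Clip a local model update to a bounded L2 norm, then add Gaussian
    noise calibrated to that bound before sending it to the parameter
    server -- the standard Gaussian-mechanism recipe for update-level DP."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))  # bound sensitivity
    noise = rng.normal(0.0, noise_mult * clip_norm, update.shape)
    return clipped + noise
```

Clipping bounds each client's sensitivity, so the noise standard deviation `noise_mult * clip_norm` directly controls the (epsilon, delta) guarantee.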