Search Results for author: Burak Hasircioglu

Found 6 papers, 3 papers with code

Communication Efficient Private Federated Learning Using Dithering

1 code implementation • 14 Sep 2023 • Burak Hasircioglu, Deniz Gunduz

The task of preserving privacy while ensuring efficient communication is a fundamental challenge in federated learning.

Federated Learning • Quantization
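For background on the technique the title names, here is a minimal sketch of subtractive dithered quantization: client and server share a random seed, the client adds the dither before rounding, and the server subtracts the identical dither after reception, making the quantization error independent of the data. The step size and seed below are illustrative placeholders, not the paper's actual scheme.

```python
import numpy as np

def dithered_quantize(x, step, seed):
    """Client: add shared-seed dither, round to the grid, send integer indices."""
    rng = np.random.default_rng(seed)              # seed shared with the server
    dither = rng.uniform(-step / 2, step / 2, x.shape)
    return np.round((x + dither) / step).astype(int)

def dithered_dequantize(q, step, seed):
    """Server: regenerate the identical dither and subtract it."""
    rng = np.random.default_rng(seed)
    dither = rng.uniform(-step / 2, step / 2, q.shape)
    return q * step - dither

update = np.random.default_rng(7).standard_normal(5)   # stand-in model update
q = dithered_quantize(update, step=0.1, seed=42)
recovered = dithered_dequantize(q, step=0.1, seed=42)
print(np.max(np.abs(update - recovered)))              # bounded by step/2
```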

Privacy Amplification via Random Participation in Federated Learning

no code implementations • 3 May 2022 • Burak Hasircioglu, Deniz Gunduz

In this work, in a federated setting, we consider random participation of the clients in addition to subsampling their local datasets.

Federated Learning
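For intuition on why the combination helps, here is a back-of-the-envelope sketch: a record influences a round only if its client participates (probability `p_client`) and the record is locally subsampled (probability `p_data`), so the effective sampling rate is their product. The formula used is the textbook amplification-by-subsampling bound, not the paper's tighter analysis, and all parameter values are assumed.

```python
import math

def amplified_eps(eps, q):
    """Textbook amplification-by-subsampling bound for sampling probability q."""
    return math.log(1.0 + q * (math.exp(eps) - 1.0))

eps_local = 2.0                  # assumed DP guarantee of the local mechanism
p_client, p_data = 0.1, 0.05     # assumed participation / subsampling rates
q_effective = p_client * p_data  # prob. a given record influences the round
print(amplified_eps(eps_local, q_effective))   # far smaller than eps_local
```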

Over-the-Air Ensemble Inference with Model Privacy

1 code implementation • 7 Feb 2022 • Selim F. Yilmaz, Burak Hasircioglu, Deniz Gunduz

We consider distributed inference at the wireless edge, where multiple clients with an ensemble of models, each trained independently on a local dataset, are queried in parallel to make an accurate decision on a new sample.
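A toy simulation of the setup, assuming the standard over-the-air computation model: clients transmit their models' soft predictions simultaneously, the multiple-access channel superposes the analog signals, and the server decides from the noisy sum without observing any individual model's output. The client models are replaced by placeholder probability vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
num_clients, num_classes = 10, 4            # assumed toy sizes

# Placeholder for each client's local model output on the same query sample.
client_probs = rng.dirichlet(np.ones(num_classes), size=num_clients)

# Simultaneous analog transmission: the multiple-access channel sums the
# signals and adds receiver noise, so no individual model output is exposed.
channel_noise = rng.normal(0.0, 0.05, size=num_classes)
received = client_probs.sum(axis=0) + channel_noise

prediction = int(np.argmax(received))       # ensemble decision from the sum
print(prediction)
```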

Speeding Up Private Distributed Matrix Multiplication via Bivariate Polynomial Codes

no code implementations • 16 Feb 2021 • Burak Hasircioglu, Jesus Gomez-Vilardebo, Deniz Gunduz

We consider the problem of private distributed matrix multiplication under limited resources.

Information Theory • Cryptography and Security • Distributed, Parallel, and Cluster Computing
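For background, here is a sketch of the simpler univariate polynomial code that bivariate codes build on: the input matrices are partitioned, encoded as polynomial evaluations at distinct points, multiplied at the workers, and the product blocks are recovered by interpolation. The bivariate construction and the privacy-preserving (secret-sharing) terms from the paper are omitted; sizes and evaluation points are illustrative.

```python
import numpy as np

A = np.random.randn(4, 3)
B = np.random.randn(3, 4)
A0, A1 = A[:2], A[2:]               # row blocks of A
B0, B1 = B[:, :2], B[:, 2:]         # column blocks of B

# Encoding: worker i receives evaluations of pA(x) = A0 + A1*x and
# pB(x) = B0 + B1*x**2 at its own point x_i.
xs = np.array([1.0, 2.0, 3.0, 4.0])         # distinct evaluation points
enc_A = [A0 + A1 * x for x in xs]
enc_B = [B0 + B1 * x**2 for x in xs]

# Each worker multiplies its coded blocks; entry-wise this is a degree-3
# polynomial in x whose coefficients are the block products Ai @ Bj.
results = [a @ b for a, b in zip(enc_A, enc_B)]

# Decoding: interpolate each entry from 4 evaluations (a Vandermonde solve).
V = np.vander(xs, 4, increasing=True)
coeffs = np.linalg.solve(V, np.stack([r.ravel() for r in results]))
c = coeffs.reshape(4, 2, 2)                 # c[k] = coefficient of x**k
C = np.block([[c[0], c[2]], [c[1], c[3]]])  # [[A0B0, A0B1], [A1B0, A1B1]]
print(np.allclose(C, A @ B))                # True
```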

Dopamine: Differentially Private Federated Learning on Medical Data

1 code implementation • 27 Jan 2021 • Mohammad Malekzadeh, Burak Hasircioglu, Nitish Mital, Kunal Katarya, Mehmet Emre Ozfatura, Deniz Gündüz

While rich medical datasets are hosted in hospitals distributed across the world, concerns about patients' privacy are a barrier against using such data to train deep neural networks (DNNs) for medical diagnostics.

Federated Learning
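For context, a minimal sketch of the DP-SGD primitive (per-example gradient clipping plus calibrated Gaussian noise) that differentially private federated training typically builds on. Dopamine's exact algorithm and privacy accounting are in the paper; the clip norm and noise multiplier below are placeholder values.

```python
import numpy as np

def private_grad(per_example_grads, clip_norm=1.0, noise_mult=1.1, rng=None):
    """Clip each example's gradient, average, then add calibrated Gaussian noise."""
    rng = rng or np.random.default_rng(0)
    clipped = [g * min(1.0, clip_norm / max(np.linalg.norm(g), 1e-12))
               for g in per_example_grads]
    avg = np.mean(clipped, axis=0)
    noise_std = noise_mult * clip_norm / len(per_example_grads)
    return avg + rng.normal(0.0, noise_std, size=avg.shape)

grads = [np.random.default_rng(i).standard_normal(10) for i in range(32)]
print(private_grad(grads)[:3])               # noisy, norm-bounded update
```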

Private Wireless Federated Learning with Anonymous Over-the-Air Computation

no code implementations • 17 Nov 2020 • Burak Hasircioglu, Deniz Gunduz

In conventional federated learning (FL), differential privacy (DP) guarantees can be obtained by injecting additional noise into the local model updates before transmitting them to the parameter server (PS).

Federated Learning
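A small illustration of the baseline the abstract describes, plus the over-the-air twist: with simultaneous analog transmission the PS only observes the superposed signal, so each client's noisy update stays hidden inside the sum while the aggregate noise averages down. Power control and the anonymity mechanism from the paper are not modeled, and all dimensions below are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
num_clients, dim, sigma = 20, 8, 0.5        # assumed toy dimensions

updates = rng.normal(size=(num_clients, dim))        # stand-in local updates
noisy = updates + rng.normal(0.0, sigma, size=updates.shape)  # DP perturbation

# Conventional (orthogonal) FL: the PS observes each noisy update separately.
per_client_view = noisy

# Over-the-air computation: simultaneous analog transmission superposes the
# signals, so the PS only ever observes their sum.
aggregate_view = noisy.sum(axis=0)

# Either way the PS can form the model average, but the noise on it has
# std sigma/sqrt(K), shrinking as more clients (and their noises) add up.
model_avg = aggregate_view / num_clients
print(model_avg.round(3))
```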
