Search Results for author: Deniz Gunduz

Found 57 papers, 8 papers with code

DRF Codes: Deep SNR-Robust Feedback Codes

no code implementations · 22 Dec 2021 · Mahdi Boloursaz Mashhadi, Deniz Gunduz, Alberto Perotti, Branislav Popovic

We present a new deep-neural-network (DNN) based error correction code for fading channels with output feedback, called deep SNR-robust feedback (DRF) code.

Privacy-Aware Communication Over the Wiretap Channel with Generative Networks

no code implementations · 8 Oct 2021 · Ecenaz Erdemir, Pier Luigi Dragotti, Deniz Gunduz

We study privacy-aware communication over a wiretap channel using end-to-end learning.

Bayesian Over-The-Air Computation

no code implementations · 8 Sep 2021 · Yulin Shao, Deniz Gunduz, Soung Chang Liew

In particular, in the misaligned OAC where there are channel misalignments among transmitted signals, ML estimation suffers from severe error propagation and noise enhancement.

Learning to Minimize Age of Information over an Unreliable Channel with Energy Harvesting

no code implementations · 30 Jun 2021 · Elif Tugce Ceran, Deniz Gunduz, Andras Gyorgy

The time average expected age of information (AoI) is studied for status updates sent over an error-prone channel from an energy-harvesting transmitter with a finite-capacity battery.
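The age of information (AoI) metric named in this entry can be illustrated with a minimal discrete-time sketch; the function and its simple slot-based model are my own illustration, not code from the paper:

```python
def time_average_aoi(receptions, horizon):
    """Discrete-time age of information: in each slot the age grows by one,
    and a delivery at slot t of an update generated at slot g resets the
    age to t - g. Returns the age averaged over the horizon."""
    deliveries = dict(receptions)  # slot -> generation time of delivered update
    age, total = 0, 0
    for t in range(1, horizon + 1):
        if t in deliveries:
            age = t - deliveries[t]  # reset to the delivered update's age
        else:
            age += 1  # status grows stale by one slot
        total += age
    return total / horizon
```

For example, updates generated at slots 1 and 3 and delivered at slots 2 and 4 give ages 1, 1, 2, 1 over four slots, i.e. a time average of 1.25.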

Neural Distributed Image Compression using Common Information

2 code implementations · 22 Jun 2021 · Nitish Mital, Ezgi Ozyilkan, Ali Garjani, Deniz Gunduz

The received latent representation and the locally generated common information are passed through a decoder network to obtain an enhanced reconstruction of the input image.

Image Compression

Less is More: Feature Selection for Adversarial Robustness with Compressive Counter-Adversarial Attacks

no code implementations · ICML Workshop AML 2021 · Emre Ozfatura, Muhammad Zaid Hameed, Kerem Ozfatura, Deniz Gunduz

Hence, we propose a novel approach to identify the important features by employing counter-adversarial attacks, which highlights the consistency at the penultimate layer with respect to perturbations on input samples.

Adversarial Robustness

AirNet: Neural Network Transmission over the Air

no code implementations · 24 May 2021 · Mikolaj Jankowski, Deniz Gunduz, Krystian Mikolajczyk

State-of-the-art performance for many emerging edge applications is achieved by deep neural networks (DNNs).

Knowledge Distillation

Denoising Noisy Neural Networks: A Bayesian Approach with Compensation

1 code implementation · 22 May 2021 · Yulin Shao, Soung Chang Liew, Deniz Gunduz

Deep neural networks (DNNs) with noisy weights, which we refer to as noisy neural networks (NoisyNNs), arise from the training and inference of DNNs in the presence of noise.

Denoising · Quantization

Deep Extended Feedback Codes

no code implementations · 4 May 2021 · Anahid Robert Safavi, Alberto G. Perotti, Branislav M. Popovic, Mahdi Boloursaz Mashhadi, Deniz Gunduz

A new deep-neural-network (DNN) based error correction encoder architecture for channels with feedback, called Deep Extended Feedback (DEF), is presented in this paper.

LIDAR and Position-Aided mmWave Beam Selection with Non-local CNNs and Curriculum Training

1 code implementation · 29 Apr 2021 · Matteo Zecchin, Mahdi Boloursaz Mashhadi, Mikolaj Jankowski, Deniz Gunduz, Marios Kountouris, David Gesbert

Efficient millimeter wave (mmWave) beam selection in vehicle-to-infrastructure (V2I) communication is a crucial yet challenging task due to the narrow mmWave beamwidth and high user mobility.

Knowledge Distillation

Gradient Coding with Dynamic Clustering for Straggler-Tolerant Distributed Learning

no code implementations · 1 Mar 2021 · Baturalp Buyukates, Emre Ozfatura, Sennur Ulukus, Deniz Gunduz

Distributed implementations are crucial in speeding up large scale machine learning applications.

Federated Edge Learning with Misaligned Over-The-Air Computation

1 code implementation · 26 Feb 2021 · Yulin Shao, Deniz Gunduz, Soung Chang Liew

Over-the-air computation (OAC) is a promising technique to realize fast model aggregation in the uplink of federated edge learning.
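The model-aggregation idea behind OAC can be sketched in a few lines: the wireless channel superposes the simultaneously transmitted analog signals, so the server only ever sees their noisy sum. This is a generic illustration under a perfectly aligned, unit-gain channel, not the misaligned setting the paper actually analyzes:

```python
import random

def over_the_air_average(local_updates, noise_std=0.01):
    """Analog over-the-air aggregation: devices transmit their updates
    simultaneously, the channel adds them up, and the server divides the
    noisy superposition by the number of devices to estimate the average."""
    n = len(local_updates)
    superposed = [sum(vals) + random.gauss(0.0, noise_std)  # channel sum + AWGN
                  for vals in zip(*local_updates)]
    return [s / n for s in superposed]
```

With `noise_std=0.0`, two devices sending `[1.0, 2.0]` and `[3.0, 4.0]` yield the exact average `[2.0, 3.0]`; channel misalignment, which the paper addresses, would distort this sum.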

A Reinforcement Learning Approach to Age of Information in Multi-User Networks with HARQ

no code implementations · 19 Feb 2021 · Elif Tugce Ceran, Deniz Gunduz, Andras Gyorgy

Scheduling the transmission of time-sensitive information from a source node to multiple users over error-prone communication channels is studied with the goal of minimizing the long-term average age of information (AoI) at the users.

Active Privacy-utility Trade-off Against a Hypothesis Testing Adversary

no code implementations · 16 Feb 2021 · Ecenaz Erdemir, Pier Luigi Dragotti, Deniz Gunduz

We consider a user releasing her data containing some personal information in return of a service.

Speeding Up Private Distributed Matrix Multiplication via Bivariate Polynomial Codes

no code implementations · 16 Feb 2021 · Burak Hasircioglu, Jesus Gomez-Vilardebo, Deniz Gunduz

We consider the problem of private distributed matrix multiplication under limited resources.

Information Theory · Cryptography and Security · Distributed, Parallel, and Cluster Computing

Federated mmWave Beam Selection Utilizing LIDAR Data

3 code implementations · 4 Feb 2021 · Mahdi Boloursaz Mashhadi, Mikolaj Jankowski, Tze-Yang Tung, Szymon Kobus, Deniz Gunduz

Efficient link configuration in millimeter wave (mmWave) communication systems is a crucial yet challenging task due to the overhead imposed by beam selection.

Time-Correlated Sparsification for Communication-Efficient Federated Learning

no code implementations · 21 Jan 2021 · Emre Ozfatura, Kerem Ozfatura, Deniz Gunduz

Sparse communication is often employed to reduce the communication load, where only a small subset of the model updates are communicated from the clients to the PS.

Federated Learning · Quantization
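The sparse-communication step this entry refers to is usually top-K sparsification of the model update; a minimal generic sketch (function names are mine, not the paper's, and the residual shown is what error-feedback schemes carry over to the next round):

```python
def top_k_sparsify(update, k):
    """Keep only the k largest-magnitude entries of a model update.

    Returns (sparse_update, error): the sparsified vector the client
    would communicate, and the residual it would accumulate locally.
    """
    # indices of the k largest-magnitude coordinates
    idx = sorted(range(len(update)), key=lambda i: abs(update[i]), reverse=True)[:k]
    kept = set(idx)
    sparse = [v if i in kept else 0.0 for i, v in enumerate(update)]
    error = [v - s for v, s in zip(update, sparse)]
    return sparse, error
```

Note the paper's point still applies to this sketch: the positions of the nonzeros (the sparsity pattern) must also be communicated, which is the overhead time-correlated sparsification tries to amortize.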

Effective Communications: A Joint Learning and Communication Framework for Multi-Agent Reinforcement Learning over Noisy Channels

no code implementations · 2 Jan 2021 · Tze-Yang Tung, Szymon Kobus, Joan Roig Pujol, Deniz Gunduz

Specifically, we consider a multi-agent partially observable Markov decision process (MA-POMDP), in which the agents, in addition to interacting with the environment can also communicate with each other over a noisy communication channel.

Multi-agent Reinforcement Learning

FedADC: Accelerated Federated Learning with Drift Control

no code implementations · 16 Dec 2020 · Emre Ozfatura, Kerem Ozfatura, Deniz Gunduz

The core of the FL strategy is the use of stochastic gradient descent (SGD) in a distributed manner.

Federated Learning

Private Wireless Federated Learning with Anonymous Over-the-Air Computation

no code implementations · 17 Nov 2020 · Burak Hasircioglu, Deniz Gunduz

In conventional federated learning (FL), differential privacy (DP) guarantees can be obtained by injecting additional noise to local model updates before transmitting to the parameter server (PS).

Federated Learning

Distributed Sparse SGD with Majority Voting

no code implementations · 12 Nov 2020 · Kerem Ozfatura, Emre Ozfatura, Deniz Gunduz

However, top-K sparsification requires additional communication load to represent the sparsity pattern, and the mismatch between the sparsity patterns of the workers prevents exploitation of efficient communication protocols.
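One way to resolve the pattern mismatch the snippet describes is to vote a common sparsity mask across workers; this is a hedged sketch of that general idea, with names of my own choosing rather than the paper's exact scheme:

```python
from collections import Counter

def majority_vote_mask(worker_top_k_indices, min_votes):
    """Combine each worker's top-K index set into one shared sparsity mask:
    keep an index only if at least `min_votes` workers selected it, so every
    worker can then communicate the same pattern."""
    votes = Counter(i for indices in worker_top_k_indices for i in indices)
    return sorted(i for i, count in votes.items() if count >= min_votes)
```

For instance, three workers proposing index sets {0, 2}, {0, 3}, {0, 2} with a two-vote threshold agree on the mask [0, 2], removing the per-worker pattern overhead.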

Gradient Coding with Dynamic Clustering for Straggler Mitigation

no code implementations · 3 Nov 2020 · Baturalp Buyukates, Emre Ozfatura, Sennur Ulukus, Deniz Gunduz

In distributed synchronous gradient descent (GD), the per-iteration completion time is limited by the slowest, straggling workers.

Blind Federated Edge Learning

no code implementations · 19 Oct 2020 · Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration, wireless devices perform local updates using their local data and the most recent global model received from the PS, and send their local updates to the PS over a wireless fading multiple access channel (MAC).

Communicate to Learn at the Edge

no code implementations · 28 Sep 2020 · Deniz Gunduz, David Burth Kurka, Mikolaj Jankowski, Mohammad Mohammadi Amiri, Emre Ozfatura, Sreejith Sreekumar

Bringing the success of modern machine learning (ML) techniques to mobile devices can enable many new services and businesses, but also poses significant technical and research challenges.

Meta-learning based Alternating Minimization Algorithm for Non-convex Optimization

no code implementations · 9 Sep 2020 · Jingyuan Xia, Shengxi Li, Jun-Jie Huang, Imad Jaimoukha, Deniz Gunduz

In this paper, we propose a novel solution for non-convex problems of multiple variables, especially those typically solved by an alternating minimization (AM) strategy, which splits the original optimization problem into a set of sub-problems corresponding to each variable and then iteratively optimizes each sub-problem using a fixed updating rule.

Matrix Completion · Meta-Learning

Convergence of Federated Learning over a Noisy Downlink

no code implementations · 25 Aug 2020 · Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

The PS has access to the global model and shares it with the devices for local training, and the devices return the result of their local updates to the PS to update the global model.

Federated Learning · Quantization

Wireless Image Retrieval at the Edge

no code implementations · 21 Jul 2020 · Mikolaj Jankowski, Deniz Gunduz, Krystian Mikolajczyk

We propose two alternative schemes based on digital and analog communications, respectively.

Image Compression · Image Retrieval

Secure Distributed Matrix Computation with Discrete Fourier Transform

2 code implementations · 8 Jul 2020 · Nitish Mital, Cong Ling, Deniz Gunduz

We consider the problem of secure distributed matrix computation (SDMC), where a user queries a function of data matrices generated at distributed source nodes.

Information Theory · Cryptography and Security · Distributed, Parallel, and Cluster Computing

Coded Distributed Computing with Partial Recovery

no code implementations · 4 Jul 2020 · Emre Ozfatura, Sennur Ulukus, Deniz Gunduz

In this paper, we first introduce a novel coded matrix-vector multiplication scheme, called coded computation with partial recovery (CCPR), which benefits from the advantages of both coded and uncoded computation schemes, and reduces both the computation time and the decoding complexity by allowing a trade-off between the accuracy and the speed of computation.

Distributed Computing

Pruning the Pilots: Deep Learning-Based Pilot Design and Channel Estimation for MIMO-OFDM Systems

no code implementations · 21 Jun 2020 · Mahdi Boloursaz Mashhadi, Deniz Gunduz

Our pruning-based pilot reduction technique reduces the overhead by allocating pilots across subcarriers non-uniformly and exploiting the inter-frequency and inter-antenna correlations in the channel matrix efficiently through convolutional layers and attention module.

Federated Learning With Quantized Global Model Updates

no code implementations · 18 Jun 2020 · Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

We analyze the convergence behavior of the proposed LFL algorithm assuming the availability of accurate local model updates at the server.

Federated Learning · Quantization

Age-Based Coded Computation for Bias Reduction in Distributed Learning

no code implementations · 2 Jun 2020 · Emre Ozfatura, Baturalp Buyukates, Deniz Gunduz, Sennur Ulukus

To mitigate biased estimators, we design a timely dynamic encoding framework for partial recovery that includes an ordering operator that changes the codewords and computation orders at workers over time.

Straggler-aware Distributed Learning: Communication Computation Latency Trade-off

no code implementations · 10 Apr 2020 · Emre Ozfatura, Sennur Ulukus, Deniz Gunduz

When gradient descent (GD) is scaled to many parallel workers for large scale machine learning problems, its per-iteration computation time is limited by the straggling workers.

Distributed Deep Convolutional Compression for Massive MIMO CSI Feedback

no code implementations · 7 Mar 2020 · Mahdi Boloursaz Mashhadi, Qianqian Yang, Deniz Gunduz

We also propose a distributed version of DeepCMC for a multi-user MIMO scenario to encode and reconstruct the CSI from multiple users in a distributed manner.

Quantization

Decentralized SGD with Over-the-Air Computation

no code implementations · 6 Mar 2020 · Emre Ozfatura, Stefano Rini, Deniz Gunduz

We study the performance of decentralized stochastic gradient descent (DSGD) in a wireless network, where the nodes collaboratively optimize an objective function using their local datasets.

Image Classification

Joint Device-Edge Inference over Wireless Links with Pruning

no code implementations · 4 Mar 2020 · Mikolaj Jankowski, Deniz Gunduz, Krystian Mikolajczyk

We propose a joint feature compression and transmission scheme for efficient inference at the wireless network edge.

General Classification · Image Classification +1

Privacy-Aware Time-Series Data Sharing with Deep Reinforcement Learning

no code implementations · 4 Mar 2020 · Ecenaz Erdemir, Pier Luigi Dragotti, Deniz Gunduz

We measure the privacy leakage by the mutual information between the user's true data sequence and the shared version.

Time Series

Convergence of Update Aware Device Scheduling for Federated Learning at the Wireless Edge

no code implementations · 28 Jan 2020 · Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration of FL, a subset of the devices are scheduled to transmit their local model updates to the PS over orthogonal channel resources, while each participating device must compress its model update to accommodate to its link capacity.

Federated Learning

One-Bit Over-the-Air Aggregation for Communication-Efficient Federated Edge Learning: Design and Convergence Analysis

no code implementations · 16 Jan 2020 · Guangxu Zhu, Yuqing Du, Deniz Gunduz, Kaibin Huang

We provide a comprehensive analysis of the effects of wireless channel hostilities (channel noise, fading, and channel estimation errors) on the convergence rate of the proposed FEEL scheme.

Information Theory · Distributed, Parallel, and Cluster Computing · Networking and Internet Architecture · Signal Processing

Deep Joint Source-Channel Coding for Wireless Image Retrieval

no code implementations · 28 Oct 2019 · Mikolaj Jankowski, Deniz Gunduz, Krystian Mikolajczyk

Motivated by surveillance applications with wireless cameras or drones, we consider the problem of image retrieval over a wireless channel.

Image Retrieval

CNN-based Analog CSI Feedback in FDD MIMO-OFDM Systems

no code implementations · 23 Oct 2019 · Mahdi Boloursaz Mashhadi, Qianqian Yang, Deniz Gunduz

Massive multiple-input multiple-output (MIMO) systems require downlink channel state information (CSI) at the base station (BS) to better utilize the available spatial diversity and multiplexing gains.

Quantization

Hierarchical Federated Learning Across Heterogeneous Cellular Networks

no code implementations · 5 Sep 2019 · Mehdi Salehi Heydar Abad, Emre Ozfatura, Deniz Gunduz, Ozgur Ercetin

We study collaborative machine learning (ML) across wireless devices, each with its own local dataset.

Federated Learning

Federated Learning over Wireless Fading Channels

no code implementations · 23 Jul 2019 · Mohammad Mohammadi Amiri, Deniz Gunduz

Overall these results show clear advantages for the proposed analog over-the-air DSGD scheme, which suggests that learning and communication algorithms should be designed jointly to achieve the best end-to-end performance in machine learning applications at the wireless edge.

Federated Learning

Privacy-Aware Location Sharing with Deep Reinforcement Learning

no code implementations · 17 Jul 2019 · Ecenaz Erdemir, Pier Luigi Dragotti, Deniz Gunduz

Existing approaches are mainly focused on the privacy of sharing a single location or on myopic location-trace privacy; neither of which takes into account the temporal correlations between past and current locations.

Information Theory · Cryptography and Security

Collaborative Machine Learning at the Wireless Edge with Blind Transmitters

no code implementations · 8 Jul 2019 · Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz

At each iteration of the DSGD algorithm wireless devices compute gradient estimates with their local datasets, and send them to the PS over a wireless fading multiple access channel (MAC).

Machine Learning in the Air

no code implementations · 28 Apr 2019 · Deniz Gunduz, Paul de Kerret, Nicholas D. Sidiropoulos, David Gesbert, Chandra Murthy, Mihaela van der Schaar

Thanks to the recent advances in processing speed and data acquisition and storage, machine learning (ML) is penetrating every facet of our lives, and transforming research in many areas in a fundamental manner.

Practical Functional Regenerating Codes for Broadcast Repair of Multiple Nodes

2 code implementations · 15 Apr 2019 · Nitish Mital, Katina Kralevska, Cong Ling, Deniz Gunduz

A code construction and repair scheme for optimal functional regeneration of multiple node failures is presented, which is based on stitching together short MDS codes on carefully chosen sets of points lying on a linearized polynomial.

Information Theory

Successive Refinement of Images with Deep Joint Source-Channel Coding

no code implementations · 15 Mar 2019 · David Burth Kurka, Deniz Gunduz

We introduce deep learning based communication methods for successive refinement of images over wireless channels.

Gradient Coding with Clustering and Multi-message Communication

no code implementations · 5 Mar 2019 · Emre Ozfatura, Deniz Gunduz, Sennur Ulukus

Gradient descent (GD) methods are commonly employed in machine learning problems to optimize the parameters of the model in an iterative fashion.

Distributed Computing

The Best Defense Is a Good Offense: Adversarial Attacks to Avoid Modulation Detection

no code implementations · 27 Feb 2019 · Muhammad Zaid Hameed, Andras Gyorgy, Deniz Gunduz

We consider a communication scenario, in which an intruder tries to determine the modulation scheme of the intercepted signal.

Curriculum Learning · Image Classification

Machine Learning at the Wireless Edge: Distributed Stochastic Gradient Descent Over-the-Air

no code implementations · 3 Jan 2019 · Mohammad Mohammadi Amiri, Deniz Gunduz

Following this digital approach, we introduce D-DSGD, in which the wireless devices employ gradient quantization and error accumulation, and transmit their gradient estimates to the PS over a multiple access channel (MAC).

Quantization
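The "gradient quantization and error accumulation" that D-DSGD's snippet mentions can be sketched generically with scaled sign quantization plus a local error memory; this is an illustrative stand-in, not the paper's exact quantizer:

```python
def sign_quantize_with_memory(grad, memory):
    """One step of gradient quantization with error accumulation:
    quantize (gradient + carried error) to a scaled sign vector, and
    keep the new quantization error for the next iteration."""
    corrected = [g + m for g, m in zip(grad, memory)]       # add carried error
    scale = sum(abs(c) for c in corrected) / len(corrected)  # mean magnitude
    quantized = [scale if c >= 0 else -scale for c in corrected]
    new_memory = [c - q for c, q in zip(corrected, quantized)]
    return quantized, new_memory
```

Each coordinate now costs roughly one bit plus a shared scale, and the accumulated memory ensures the quantization error is not lost but fed back into later iterations.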

Distributed Gradient Descent with Coded Partial Gradient Computations

no code implementations · 22 Nov 2018 · Emre Ozfatura, Sennur Ulukus, Deniz Gunduz

Coded computation techniques provide robustness against straggling servers in distributed computing, with the following limitations: First, they increase decoding complexity.

Distributed Computing

Deep Joint Source-Channel Coding for Wireless Image Transmission

no code implementations · 4 Sep 2018 · Eirina Bourtsoulatze, David Burth Kurka, Deniz Gunduz

We propose a joint source and channel coding (JSCC) technique for wireless image transmission that does not rely on explicit codes for either compression or error correction; instead, it directly maps the image pixel values to the complex-valued channel input symbols.

Speeding Up Distributed Gradient Descent by Utilizing Non-persistent Stragglers

no code implementations · 7 Aug 2018 · Emre Ozfatura, Deniz Gunduz, Sennur Ulukus

In most of the existing DGD schemes, either with coded computation or coded communication, the non-straggling CSs transmit one message per iteration once they complete all their assigned computation tasks.

Multi-Access Communications with Energy Harvesting: A Multi-Armed Bandit Model and the Optimality of the Myopic Policy

no code implementations · 1 Jan 2015 · Pol Blasco, Deniz Gunduz

The energy arrival process at each node is modelled as an independent two-state Markov process, such that, at each TS, a node either harvests one unit of energy, or none.
