Search Results for author: Mohammad Mohammadi Amiri

Found 10 papers, 0 papers with code

Federated Learning with Downlink Device Selection

no code implementations • 7 Jul 2021 • Mohammad Mohammadi Amiri, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration, the PS broadcasts different quantized global model updates to different participating devices based on the last global model estimates available at the devices.

Federated Learning • Image Classification

Blind Federated Edge Learning

no code implementations • 19 Oct 2020 • Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration, wireless devices perform local updates using their local data and the most recent global model received from the PS, and send their local updates to the PS over a wireless fading multiple access channel (MAC).

Communicate to Learn at the Edge

no code implementations • 28 Sep 2020 • Deniz Gunduz, David Burth Kurka, Mikolaj Jankowski, Mohammad Mohammadi Amiri, Emre Ozfatura, Sreejith Sreekumar

Bringing the success of modern machine learning (ML) techniques to mobile devices can enable many new services and businesses, but also poses significant technical and research challenges.

Convergence of Federated Learning over a Noisy Downlink

no code implementations • 25 Aug 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

The PS has access to the global model and shares it with the devices for local training, and the devices return the result of their local updates to the PS to update the global model.

Federated Learning • Quantization
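
The workflow in this abstract — the PS broadcasts the global model, devices run local training, and the PS averages the returned updates — can be sketched as a generic FedAvg-style round. This is an illustrative sketch of that generic loop on a least-squares task, not the paper's noisy-downlink model; the function name `fl_round` and all parameters are hypothetical.

```python
import numpy as np

def fl_round(global_model, local_datasets, lr=0.1, local_steps=5):
    """One generic FL round (hypothetical sketch, not the paper's scheme).
    Each device fits a least-squares model y = X @ w by local gradient
    descent starting from the broadcast global model; the PS then adds
    the average of the returned local updates to the global model."""
    updates = []
    for X, y in local_datasets:
        w = global_model.copy()                    # model received from the PS
        for _ in range(local_steps):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # least-squares gradient
            w -= lr * grad
        updates.append(w - global_model)           # local model update
    return global_model + np.mean(updates, axis=0)

rng = np.random.default_rng(5)
w_true = np.array([1.0, -2.0])
datasets = []
for _ in range(4):                                 # 4 devices, noiseless data
    X = rng.standard_normal((20, 2))
    datasets.append((X, X @ w_true))
w = np.zeros(2)
for _ in range(30):                                # 30 communication rounds
    w = fl_round(w, datasets)
```

With noiseless, identically generated local data the global model converges to the common optimum; the papers below study what happens when the downlink or uplink channel distorts these exchanges.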

Federated Learning With Quantized Global Model Updates

no code implementations • 18 Jun 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

We analyze the convergence behavior of the proposed LFL algorithm assuming the availability of accurate local model updates at the server.

Federated Learning • Quantization
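
A quantized global model update, as in the LFL setting above, can be illustrated with a stochastic uniform quantizer — a standard unbiased compressor, chosen here for illustration; the paper's exact quantization scheme may differ, and `stochastic_quantize` is a hypothetical helper name.

```python
import numpy as np

def stochastic_quantize(v, num_levels=16, rng=None):
    """Stochastically quantize v onto num_levels uniform levels.
    Hypothetical quantizer for illustration: each entry rounds up with
    probability equal to its fractional grid position, so E[q] == v."""
    rng = np.random.default_rng() if rng is None else rng
    vmin, vmax = v.min(), v.max()
    scale = (vmax - vmin) / (num_levels - 1)
    if scale == 0:
        return v.copy()
    pos = (v - vmin) / scale                  # position on the quantization grid
    lower = np.floor(pos)
    q = lower + (rng.random(v.shape) < (pos - lower))  # unbiased rounding
    return vmin + q * scale

rng = np.random.default_rng(0)
update = rng.standard_normal(1000)            # a global model update
q = stochastic_quantize(update, num_levels=16, rng=rng)
```

Unbiasedness is what lets convergence analyses of the kind mentioned in the abstract carry over: the quantization error behaves like zero-mean noise whose variance shrinks as the number of levels grows.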

A Compressive Sensing Approach for Federated Learning over Massive MIMO Communication Systems

no code implementations • 18 Mar 2020 • Yo-Seb Jeon, Mohammad Mohammadi Amiri, Jun Li, H. Vincent Poor

One major challenge in system design is to accurately reconstruct, at the central server, the local gradient vectors that are computed and sent by the wireless devices.

Compressive Sensing • Federated Learning
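
The reconstruction problem above — recovering a (sparsified) gradient vector from a small number of linear measurements — is the classic compressive-sensing setup. A minimal sketch using ISTA (iterative soft-thresholding), a standard sparse-recovery solver used here as an illustrative stand-in for the paper's reconstruction algorithm:

```python
import numpy as np

def ista(y, A, lam=0.005, iters=1000):
    """Recover a sparse x from y = A @ x by iterative soft-thresholding.
    Illustrative solver; the paper's reconstruction method may differ."""
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x + A.T @ (y - A @ x) / L     # gradient step on 0.5*||y - A x||^2
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return x

rng = np.random.default_rng(1)
n, m, k = 200, 80, 5                      # gradient dim, measurements, sparsity
g = np.zeros(n)
g[rng.choice(n, k, replace=False)] = rng.standard_normal(k)  # sparse gradient
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
x_hat = ista(A @ g, A)                    # server-side reconstruction
```

The point of the approach is the measurement count: the server recovers a length-`n` gradient from only `m < n` received values, which is what makes the scheme attractive over a bandwidth-limited massive MIMO uplink.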

Convergence of Update Aware Device Scheduling for Federated Learning at the Wireless Edge

no code implementations • 28 Jan 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration of FL, a subset of the devices is scheduled to transmit their local model updates to the PS over orthogonal channel resources, while each participating device must compress its model update to accommodate its link capacity.

Federated Learning
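
The two steps in this abstract — update-aware scheduling and per-device compression — can be sketched as follows. Selecting the largest-norm updates and top-k sparsification are illustrative choices; the paper's scheduling metric and compressor may differ, and `schedule_and_compress` is a hypothetical name.

```python
import numpy as np

def schedule_and_compress(updates, num_scheduled, k):
    """Update-aware scheduling sketch (illustrative, not the paper's exact
    policy): schedule the devices whose local updates have the largest l2
    norm, then top-k sparsify each scheduled update to fit its link, and
    let the PS average what it receives."""
    norms = np.linalg.norm(updates, axis=1)
    chosen = np.argsort(norms)[-num_scheduled:]   # largest-norm devices
    compressed = []
    for i in chosen:
        u = updates[i]
        keep = np.argsort(np.abs(u))[-k:]         # keep the k largest entries
        sparse = np.zeros_like(u)
        sparse[keep] = u[keep]
        compressed.append(sparse)
    return chosen, np.mean(compressed, axis=0)    # PS-side aggregate

rng = np.random.default_rng(2)
updates = rng.standard_normal((10, 50))           # 10 devices, model dim 50
chosen, agg = schedule_and_compress(updates, num_scheduled=3, k=10)
```

Scheduling by update magnitude biases the limited channel resources toward the devices whose updates would move the global model the most, which is the intuition behind "update aware" selection in the title.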

Federated Learning over Wireless Fading Channels

no code implementations • 23 Jul 2019 • Mohammad Mohammadi Amiri, Deniz Gunduz

Overall these results show clear advantages for the proposed analog over-the-air DSGD scheme, which suggests that learning and communication algorithms should be designed jointly to achieve the best end-to-end performance in machine learning applications at the wireless edge.

Federated Learning
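
The analog over-the-air DSGD idea praised in this abstract exploits the fact that a multiple access channel naturally adds the transmitted signals. A minimal sketch, assuming channel-inversion power control at the devices (the paper's actual precoding and power constraints may differ):

```python
import numpy as np

rng = np.random.default_rng(3)
d, num_devices = 32, 5
grads = rng.standard_normal((num_devices, d))   # local gradient estimates

# Assumed channel-inversion precoding: each device pre-scales its gradient
# by 1/h_i so the fading MAC's superposition yields the gradient sum.
h = 0.5 + rng.random(num_devices)               # per-device fading gains
tx = grads / h[:, None]                         # transmitted analog signals
noise = 0.01 * rng.standard_normal(d)           # receiver noise at the PS
rx = (h[:, None] * tx).sum(axis=0) + noise      # the MAC adds the signals
avg_grad = rx / num_devices                     # PS estimate of the average
```

Because aggregation happens "in the air", the uplink cost does not grow with the number of devices — the joint learning/communication design advantage the abstract refers to.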

Collaborative Machine Learning at the Wireless Edge with Blind Transmitters

no code implementations • 8 Jul 2019 • Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz

At each iteration of the DSGD algorithm, wireless devices compute gradient estimates using their local datasets and send them to the PS over a wireless fading multiple access channel (MAC).

Machine Learning at the Wireless Edge: Distributed Stochastic Gradient Descent Over-the-Air

no code implementations • 3 Jan 2019 • Mohammad Mohammadi Amiri, Deniz Gunduz

Following this digital approach, we introduce D-DSGD, in which the wireless devices employ gradient quantization and error accumulation, and transmit their gradient estimates to the PS over a multiple access channel (MAC).

Quantization
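
The two mechanisms named in this abstract — gradient compression plus error accumulation — can be sketched with top-k sparsification as the stand-in compressor (the actual D-DSGD quantizer differs in detail; `dsgd_compress` and `budget_frac` are hypothetical names):

```python
import numpy as np

def dsgd_compress(grad, error, budget_frac=0.1):
    """One compression step with error feedback (illustrative sketch, not
    D-DSGD's exact quantizer): add the accumulated residual to the fresh
    gradient, transmit only the entries that fit the channel budget, and
    carry the untransmitted remainder forward as the new residual."""
    corrected = grad + error                      # error accumulation
    k = max(1, int(budget_frac * grad.size))
    keep = np.argsort(np.abs(corrected))[-k:]
    sent = np.zeros_like(corrected)
    sent[keep] = corrected[keep]
    return sent, corrected - sent                 # transmitted, new residual

rng = np.random.default_rng(4)
error = np.zeros(100)
total_sent = np.zeros(100)
total_grad = np.zeros(100)
for _ in range(50):                               # 50 DSGD iterations
    g = rng.standard_normal(100)
    sent, error = dsgd_compress(g, error)
    total_sent += sent
    total_grad += g
```

Error accumulation makes the compression lossless in the long run: the gap between the cumulative transmitted signal and the cumulative true gradient is exactly the current residual, so no gradient information is permanently discarded.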
