Search Results for author: Mohammad Mohammadi Amiri

Found 13 papers, 1 paper with code

Fundamentals of Task-Agnostic Data Valuation

no code implementations • 25 Aug 2022 • Mohammad Mohammadi Amiri, Frederic Berdoz, Ramesh Raskar

We capture these statistical differences through the second moment by measuring the diversity and relevance of the seller's data for the buyer; we estimate these measures through queries to the seller without requesting raw data.

Data Valuation
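A minimal sketch of how second-moment statistics could yield such diversity and relevance scores, under an assumed illustrative definition (the share of the seller's second-moment energy inside the buyer's top-k principal subspace); the paper's exact measures and query protocol may differ, and only the seller's second-moment matrix would need to be exchanged rather than raw data.

```python
import numpy as np

def second_moment(X):
    """Uncentered second-moment matrix of a dataset (rows are samples)."""
    return X.T @ X / X.shape[0]

def relevance_and_diversity(buyer_X, seller_X, k=5):
    """Illustrative second-moment scores (hypothetical definitions, not the
    paper's): relevance is the fraction of the seller's second-moment energy
    inside the buyer's top-k principal subspace; diversity is the rest."""
    M_buyer = second_moment(buyer_X)
    M_seller = second_moment(seller_X)   # in practice, obtainable via a query
    _, eigvecs = np.linalg.eigh(M_buyer)
    U = eigvecs[:, -k:]                  # buyer's top-k principal directions
    energy_in = np.trace(U.T @ M_seller @ U)
    relevance = energy_in / np.trace(M_seller)
    return relevance, 1.0 - relevance

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    buyer = rng.normal(size=(500, 20))
    seller = rng.normal(size=(800, 20)) @ np.diag(np.linspace(0.1, 2.0, 20))
    print(relevance_and_diversity(buyer, seller))
```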

Private independence testing across two parties

no code implementations • 8 Jul 2022 • Praneeth Vepakomma, Mohammad Mohammadi Amiri, Clément L. Canonne, Ramesh Raskar, Alex Pentland

We introduce $\pi$-test, a privacy-preserving algorithm for testing statistical independence between data distributed across multiple parties.

Privacy Preserving · Vocal Bursts Valence Prediction

Federated Learning with Downlink Device Selection

no code implementations • 7 Jul 2021 • Mohammad Mohammadi Amiri, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration, the PS broadcasts different quantized global model updates to different participating devices based on the last global model estimates available at the devices.

Federated Learning · Image Classification
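A minimal sketch of the broadcast step described above, assuming a simple uniform stochastic quantizer and a per-device update equal to the gap between the current global model and that device's last estimate; the paper's quantization and selection rules are not reproduced here.

```python
import numpy as np

def stochastic_quantize(x, bits=4, rng=None):
    """Uniform stochastic quantizer, an illustrative stand-in for the
    quantized downlink transmission (not the paper's exact scheme)."""
    rng = np.random.default_rng(rng)
    levels = 2 ** bits - 1
    lo, hi = x.min(), x.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    normalized = (x - lo) / scale
    floor = np.floor(normalized)
    # Randomized rounding keeps the quantizer unbiased.
    q = floor + (rng.random(x.shape) < (normalized - floor))
    return lo + q * scale

def broadcast_updates(global_model, last_estimates, bits=4):
    """PS side: quantize, per device, the gap between the current global model
    and the last global-model estimate available at that device."""
    return {dev: stochastic_quantize(global_model - est, bits)
            for dev, est in last_estimates.items()}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 10
    global_model = rng.normal(size=d)
    last_estimates = {0: np.zeros(d), 1: rng.normal(size=d)}
    for dev, upd in broadcast_updates(global_model, last_estimates).items():
        recovered = last_estimates[dev] + upd   # device-side reconstruction
        print(dev, np.linalg.norm(recovered - global_model))
```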

Blind Federated Edge Learning

no code implementations • 19 Oct 2020 • Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration, wireless devices perform local updates using their local data and the most recent global model received from the PS, and send their local updates to the PS over a wireless fading multiple access channel (MAC).
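A toy simulation of the over-the-air superposition on a fading MAC, assuming blind (no-CSI) transmitters and a PS that knows the uplink channels and combines the received signals across a large antenna array; the paper's transceiver design is more involved.

```python
import numpy as np

def blind_ota_aggregation(updates, num_antennas=2048, noise_std=0.1, seed=0):
    """Toy over-the-air aggregation over a Rayleigh-fading MAC: devices
    transmit simultaneously without channel state information, and the PS
    (assumed to know the uplink channels) combines across a large antenna
    array so that cross-device interference averages out."""
    rng = np.random.default_rng(seed)
    K, d = updates.shape                       # devices x model dimension
    # h[k, m]: fading coefficient from device k to PS antenna m.
    h = (rng.normal(size=(K, num_antennas)) +
         1j * rng.normal(size=(K, num_antennas))) / np.sqrt(2)
    # Each antenna observes the superposition of all faded transmissions.
    y = h.T @ updates.astype(complex)
    y += noise_std * (rng.normal(size=y.shape) + 1j * rng.normal(size=y.shape))
    # Combine with the summed conjugate channels; cross terms vanish as M grows.
    g = np.conj(h).sum(axis=0)                 # one combining weight per antenna
    return (g[:, None] * y).mean(axis=0).real

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    updates = rng.normal(size=(8, 100))        # 8 devices, 100-dim local updates
    est = blind_ota_aggregation(updates)
    true_sum = updates.sum(axis=0)
    print(np.linalg.norm(est - true_sum) / np.linalg.norm(true_sum))
```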

Communicate to Learn at the Edge

no code implementations • 28 Sep 2020 • Deniz Gunduz, David Burth Kurka, Mikolaj Jankowski, Mohammad Mohammadi Amiri, Emre Ozfatura, Sreejith Sreekumar

Bringing the success of modern machine learning (ML) techniques to mobile devices can enable many new services and businesses, but also poses significant technical and research challenges.

Wireless for Machine Learning

no code implementations • 31 Aug 2020 • Henrik Hellström, José Mairton B. da Silva Jr, Mohammad Mohammadi Amiri, Mingzhe Chen, Viktoria Fodor, H. Vincent Poor, Carlo Fischione

As data generation increasingly takes place on devices without a wired connection, machine learning (ML) related traffic will be ubiquitous in wireless networks.

Active Learning · BIG-bench Machine Learning · +1

Convergence of Federated Learning over a Noisy Downlink

no code implementations • 25 Aug 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

The PS has access to the global model and shares it with the devices for local training, and the devices return the result of their local updates to the PS to update the global model.

Federated Learning · Quantization
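A minimal FedAvg-style sketch of the loop described above, with an additive-noise stand-in for the noisy downlink broadcast; the local objective, noise model, and update rule here are illustrative rather than the paper's.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.05, steps=5):
    """A few local gradient steps on a least-squares objective."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_round(w_global, devices, downlink_noise_std=0.0, seed=0):
    """One round of the loop described above: the PS shares the global model
    over a (here, simply additive-noise) downlink, devices train locally and
    return their updates, and the PS averages them into the new global model."""
    rng = np.random.default_rng(seed)
    updates = []
    for X, y in devices:
        # Device receives a noisy copy of the global model (noisy downlink).
        w_received = w_global + downlink_noise_std * rng.normal(size=w_global.shape)
        w_local = local_sgd(w_received, X, y)
        updates.append(w_local - w_received)   # local update returned to the PS
    return w_global + np.mean(updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = rng.normal(size=5)
    devices = []
    for _ in range(4):
        X = rng.normal(size=(200, 5))
        devices.append((X, X @ w_true + 0.1 * rng.normal(size=200)))
    w = np.zeros(5)
    for t in range(40):
        w = federated_round(w, devices, downlink_noise_std=0.05, seed=t)
    print(np.linalg.norm(w - w_true))
```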

Federated Learning With Quantized Global Model Updates

no code implementations • 18 Jun 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

We analyze the convergence behavior of the proposed LFL algorithm assuming the availability of accurate local model updates at the server.

Federated Learning · Quantization
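A sketch of an unbiased QSGD-style quantizer as a stand-in for the quantization applied to the global model update in LFL; the paper's exact quantizer and encoding are not reproduced here.

```python
import numpy as np

def qsgd_quantize(v, levels=16, rng=None):
    """QSGD-style unbiased quantizer: a vector is encoded by its norm, signs,
    and randomly rounded magnitude levels (an illustrative stand-in for the
    quantizer applied to the global model update)."""
    rng = np.random.default_rng(rng)
    norm = np.linalg.norm(v)
    if norm == 0:
        return np.zeros_like(v)
    scaled = np.abs(v) / norm * levels
    lower = np.floor(scaled)
    # Randomized rounding makes the quantizer unbiased: E[q] = v.
    q = lower + (rng.random(v.shape) < (scaled - lower))
    return np.sign(v) * q * norm / levels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    update = rng.normal(size=1000)
    averaged = np.mean([qsgd_quantize(update, rng=s) for s in range(200)], axis=0)
    # Averaging many independent quantizations approaches the true update.
    print(np.linalg.norm(averaged - update) / np.linalg.norm(update))
```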

A Compressive Sensing Approach for Federated Learning over Massive MIMO Communication Systems

no code implementations • 18 Mar 2020 • Yo-Seb Jeon, Mohammad Mohammadi Amiri, Jun Li, H. Vincent Poor

One major challenge in system design is to accurately reconstruct, at the central server, the local gradient vectors computed and sent by the wireless devices.

Compressive Sensing · Federated Learning · +1
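A toy sketch of the compressive-sensing idea behind gradient reconstruction: a sparse gradient vector is compressed by a random projection and recovered at the server with ISTA; the paper's massive-MIMO transceiver and recovery algorithm are more elaborate.

```python
import numpy as np

def ista(A, y, lam=0.05, iters=300):
    """Iterative soft-thresholding (ISTA) for sparse recovery:
    minimize 0.5*||A x - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L, with L the squared spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * A.T @ (A @ x - y)      # gradient step on the quadratic term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, m, k = 400, 120, 10                    # dimension, measurements, sparsity
    grad = np.zeros(d)
    grad[rng.choice(d, k, replace=False)] = rng.normal(size=k)   # sparse gradient
    A = rng.normal(size=(m, d)) / np.sqrt(m)  # random compression matrix
    y = A @ grad + 0.01 * rng.normal(size=m)  # compressed, noisy observation
    recovered = ista(A, y)
    print(np.linalg.norm(recovered - grad) / np.linalg.norm(grad))
```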

Convergence of Update Aware Device Scheduling for Federated Learning at the Wireless Edge

no code implementations • 28 Jan 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor

At each iteration of FL, a subset of the devices is scheduled to transmit their local model updates to the PS over orthogonal channel resources, while each participating device must compress its model update to fit its link capacity.

Federated Learning · Scheduling
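A minimal sketch of one plausible update-aware policy: schedule the devices with the largest update norms and sparsify each scheduled update to fit its link; the selection and compression rules here are illustrative, not the paper's exact ones.

```python
import numpy as np

def schedule_devices(updates, num_selected):
    """Illustrative update-aware policy: schedule the devices whose local
    model updates have the largest l2-norms (one plausible policy; the paper
    defines and analyzes its own scheduling policies)."""
    norms = np.array([np.linalg.norm(u) for u in updates])
    return np.argsort(norms)[-num_selected:]

def compress_to_capacity(update, capacity_coords):
    """Toy compression: keep only the largest-magnitude coordinates so the
    update fits the device's link capacity (a sparsification stand-in)."""
    sparse = np.zeros_like(update)
    keep = np.argsort(np.abs(update))[-capacity_coords:]
    sparse[keep] = update[keep]
    return sparse

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    updates = [rng.normal(scale=s, size=50) for s in (0.1, 0.5, 1.0, 2.0)]
    capacities = [10, 20, 15, 5]               # coordinates each link can carry
    chosen = schedule_devices(updates, num_selected=2)
    aggregated = np.mean(
        [compress_to_capacity(updates[i], capacities[i]) for i in chosen], axis=0)
    print(chosen, np.linalg.norm(aggregated))
```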

Federated Learning over Wireless Fading Channels

no code implementations • 23 Jul 2019 • Mohammad Mohammadi Amiri, Deniz Gunduz

Overall, these results show clear advantages for the proposed analog over-the-air DSGD scheme, which suggests that learning and communication algorithms should be designed jointly to achieve the best end-to-end performance in machine learning applications at the wireless edge.

Federated Learning

Collaborative Machine Learning at the Wireless Edge with Blind Transmitters

no code implementations • 8 Jul 2019 • Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz

At each iteration of the DSGD algorithm, wireless devices compute gradient estimates with their local datasets and send them to the PS over a wireless fading multiple access channel (MAC).

BIG-bench Machine Learning

Machine Learning at the Wireless Edge: Distributed Stochastic Gradient Descent Over-the-Air

1 code implementation • 3 Jan 2019 • Mohammad Mohammadi Amiri, Deniz Gunduz

Following this digital approach, we introduce D-DSGD, in which the wireless devices employ gradient quantization and error accumulation, and transmit their gradient estimates to the PS over a multiple access channel (MAC).

BIG-bench Machine Learning · Quantization
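A sketch of the device-side quantization with error accumulation (error feedback), using a scaled sign quantizer as an illustrative stand-in; D-DSGD's encoding for transmission over the MAC is more elaborate.

```python
import numpy as np

def sign_quantize(v):
    """Scaled sign quantizer, a simple stand-in for the gradient
    quantization used on the digital uplink."""
    return np.sign(v) * np.mean(np.abs(v))

def device_step(grad, error_memory):
    """One device-side step of quantization with error accumulation: the
    quantization error is carried over and added to the next gradient
    before quantizing (error feedback)."""
    compensated = grad + error_memory
    quantized = sign_quantize(compensated)
    new_error = compensated - quantized        # residual kept for the next round
    return quantized, new_error

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    error = np.zeros(100)
    sent_sum, true_sum = np.zeros(100), np.zeros(100)
    for _ in range(200):
        grad = rng.normal(size=100)
        q, error = device_step(grad, error)
        sent_sum += q
        true_sum += grad
    # With error feedback, the transmitted sum tracks the true gradient sum.
    print(np.linalg.norm(sent_sum - true_sum) / np.linalg.norm(true_sum))
```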
