no code implementations • 25 Aug 2022 • Mohammad Mohammadi Amiri, Frederic Berdoz, Ramesh Raskar
We capture these statistical differences through second-order moments, measuring the diversity and relevance of the seller's data for the buyer; we estimate these measures through queries to the seller without requesting the raw data.
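A minimal sketch of this idea, assuming the seller reveals only a second-moment matrix in answer to a query; the cosine-of-moments "relevance" and subspace-residual "diversity" proxies below are illustrative stand-ins, not the paper's exact measures.

```python
import numpy as np

def second_moment(X):
    # Aggregate statistic the seller can share in place of raw data.
    return X.T @ X / len(X)

def relevance(buyer_X, seller_M, eps=1e-9):
    # Hypothetical relevance proxy: alignment (cosine similarity)
    # between the buyer's and seller's second-moment matrices.
    Mb = second_moment(buyer_X)
    num = np.trace(Mb.T @ seller_M)
    den = np.linalg.norm(Mb) * np.linalg.norm(seller_M) + eps
    return num / den

def diversity(buyer_X, seller_M):
    # Hypothetical diversity proxy: seller mass outside the buyer's
    # principal subspace, estimated from second moments alone.
    Mb = second_moment(buyer_X)
    _, V = np.linalg.eigh(Mb)
    top = V[:, -2:]                           # buyer's top-2 directions
    proj = top @ (top.T @ seller_M @ top) @ top.T
    return np.linalg.norm(seller_M - proj) / np.linalg.norm(seller_M)

rng = np.random.default_rng(0)
buyer = rng.normal(size=(500, 8))
seller = rng.normal(size=(800, 8)) @ rng.normal(size=(8, 8))
M_seller = second_moment(seller)              # the only thing the seller reveals
print(relevance(buyer, M_seller), diversity(buyer, M_seller))
```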
no code implementations • 8 Jul 2022 • Praneeth Vepakomma, Mohammad Mohammadi Amiri, Clément L. Canonne, Ramesh Raskar, Alex Pentland
We introduce $\pi$-test, a privacy-preserving algorithm for testing statistical independence between data distributed across multiple parties.
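As a rough illustration only, the sketch below runs a permutation test for independence on columns that each party has privatized with Laplace noise; both the noise mechanism and the correlation statistic are generic stand-ins and are not claimed to be the actual $\pi$-test.

```python
import numpy as np

def pi_test_sketch(x, y, eps=1.0, n_perm=200, seed=0):
    """Toy stand-in for a privacy-preserving independence test.

    Each party privatizes its own column with Laplace noise (a generic
    DP mechanism, not necessarily the paper's), then a permutation test
    on the correlation of the noisy columns yields a p-value.
    """
    rng = np.random.default_rng(seed)
    x_priv = x + rng.laplace(scale=1.0 / eps, size=x.shape)
    y_priv = y + rng.laplace(scale=1.0 / eps, size=y.shape)
    stat = abs(np.corrcoef(x_priv, y_priv)[0, 1])
    null = [abs(np.corrcoef(rng.permutation(x_priv), y_priv)[0, 1])
            for _ in range(n_perm)]
    return (1 + sum(s >= stat for s in null)) / (1 + n_perm)

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
print(pi_test_sketch(x, 2 * x + rng.normal(size=1000)))  # small p: dependent
print(pi_test_sketch(x, rng.normal(size=1000)))          # large p: independent
```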
no code implementations • 7 Jul 2021 • Mohammad Mohammadi Amiri, Sanjeev R. Kulkarni, H. Vincent Poor
At each iteration, the PS broadcasts different quantized global model updates to different participating devices based on the last global model estimates available at the devices.
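A toy sketch of this broadcast step, assuming a uniform scalar quantizer and a PS that tracks each device's last model estimate; both choices are illustrative, not the paper's exact design.

```python
import numpy as np

def quantize(v, bits):
    # Uniform scalar quantizer: fewer bits give a coarser update.
    scale = np.max(np.abs(v)) + 1e-12
    levels = 2 ** (bits - 1)
    return np.round(v / scale * levels) / levels * scale

d, n_devices = 16, 3
rng = np.random.default_rng(0)
global_model = rng.normal(size=d)
device_estimates = [np.zeros(d) for _ in range(n_devices)]

for k in range(n_devices):
    # The PS quantizes the *difference* from each device's last estimate,
    # so each device receives an update tailored to what it already has.
    diff = global_model - device_estimates[k]
    update = quantize(diff, bits=4)
    device_estimates[k] += update
    print(k, np.linalg.norm(global_model - device_estimates[k]))
```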
no code implementations • 19 Oct 2020 • Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor
At each iteration, wireless devices perform local updates using their local data and the most recent global model received from the PS, and send their local updates to the PS over a wireless fading multiple access channel (MAC).
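A schematic of one such iteration, under the simplifying assumptions of a toy quadratic loss, perfect channel knowledge at the devices, and channel-inversion power control; none of these is claimed to match the paper's scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_devices = 10, 5
global_model = np.zeros(d)

# Each device takes one local gradient step on a toy quadratic loss,
# then transmits its update over the fading MAC.
local_updates = []
for k in range(n_devices):
    target = rng.normal(size=d)          # stand-in for local data
    grad = global_model - target          # gradient of 0.5 * ||w - target||^2
    local_updates.append(-0.1 * grad)

h = rng.rayleigh(scale=1.0, size=n_devices)           # fading gains
tx = [u / h[k] for k, u in enumerate(local_updates)]  # channel inversion
rx = sum(h[k] * tx[k] for k in range(n_devices))      # MAC superposition
rx += 0.01 * rng.normal(size=d)                       # receiver noise
global_model += rx / n_devices                        # PS recovers the mean update
```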
no code implementations • 28 Sep 2020 • Deniz Gunduz, David Burth Kurka, Mikolaj Jankowski, Mohammad Mohammadi Amiri, Emre Ozfatura, Sreejith Sreekumar
Bringing the success of modern machine learning (ML) techniques to mobile devices can enable many new services and businesses, but also poses significant technical and research challenges.
no code implementations • 31 Aug 2020 • Henrik Hellström, José Mairton B. da Silva Jr, Mohammad Mohammadi Amiri, Mingzhe Chen, Viktoria Fodor, H. Vincent Poor, Carlo Fischione
As data generation increasingly takes place on devices without a wired connection, machine learning (ML) related traffic will be ubiquitous in wireless networks.
no code implementations • 25 Aug 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor
The PS has access to the global model and shares it with the devices for local training, and the devices return the results of their local updates to the PS to update the global model.
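This is the familiar federated averaging loop; the sketch below is a minimal stand-in with a toy quadratic objective, not the paper's algorithm.

```python
import numpy as np

def local_update(w, data, lr=0.1, steps=5):
    # Toy local training: SGD on 0.5 * ||w - x||^2 averaged over the data.
    for _ in range(steps):
        w = w - lr * (w - data.mean(axis=0))
    return w

rng = np.random.default_rng(0)
d, n_devices = 4, 8
datasets = [rng.normal(loc=k, size=(50, d)) for k in range(n_devices)]
global_w = np.zeros(d)

for rnd in range(10):
    # PS shares the global model; devices train locally and return results.
    results = [local_update(global_w.copy(), ds) for ds in datasets]
    global_w = np.mean(results, axis=0)   # PS updates the global model
```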
no code implementations • 18 Jun 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor
We analyze the convergence behavior of the proposed LFL algorithm assuming the availability of accurate local model updates at the server.
no code implementations • 18 Mar 2020 • Yo-Seb Jeon, Mohammad Mohammadi Amiri, Jun Li, H. Vincent Poor
One major challenge in system design is to accurately reconstruct, at the central server, the local gradient vectors computed at and sent from the wireless devices.
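One generic way to pose this reconstruction is as a linear inverse problem solved by least squares; the sketch below assumes the server observes noisy linear mixtures of the gradients over several channel uses, an illustrative model rather than the paper's method.

```python
import numpy as np

# Server sees Y = H @ G + noise across multiple channel uses and
# recovers the per-device gradients G by least squares.
rng = np.random.default_rng(0)
d, n_devices, n_uses = 6, 3, 8
G = rng.normal(size=(n_devices, d))            # true local gradients
H = rng.normal(size=(n_uses, n_devices))       # per-use channel gains
Y = H @ G + 0.05 * rng.normal(size=(n_uses, d))
G_hat, *_ = np.linalg.lstsq(H, Y, rcond=None)  # server-side reconstruction
print(np.linalg.norm(G - G_hat) / np.linalg.norm(G))
```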
no code implementations • 28 Jan 2020 • Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor
At each iteration of FL, a subset of the devices is scheduled to transmit their local model updates to the PS over orthogonal channel resources, while each participating device must compress its model update to accommodate its link capacity.
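A minimal sketch of scheduling plus per-link compression, using top-k sparsification as a stand-in compressor; the paper's compression scheme may differ.

```python
import numpy as np

def top_k(update, k):
    # Keep only the k largest-magnitude entries.
    out = np.zeros_like(update)
    idx = np.argsort(np.abs(update))[-k:]
    out[idx] = update[idx]
    return out

rng = np.random.default_rng(0)
n_devices, d = 10, 100
updates = [rng.normal(size=d) for _ in range(n_devices)]

scheduled = rng.choice(n_devices, size=4, replace=False)   # subset per round
link_budget = {k: rng.integers(5, 20) for k in scheduled}  # entries per link
compressed = {k: top_k(updates[k], link_budget[k]) for k in scheduled}
global_update = np.mean([compressed[k] for k in scheduled], axis=0)
```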
no code implementations • 23 Jul 2019 • Mohammad Mohammadi Amiri, Deniz Gunduz
Overall, these results show clear advantages for the proposed analog over-the-air DSGD scheme, suggesting that learning and communication algorithms should be designed jointly to achieve the best end-to-end performance in machine learning applications at the wireless edge.
no code implementations • 8 Jul 2019 • Mohammad Mohammadi Amiri, Tolga M. Duman, Deniz Gunduz
At each iteration of the DSGD algorithm, wireless devices compute gradient estimates using their local datasets and send them to the PS over a wireless fading multiple access channel (MAC).
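Stripped of the fading channel, the essence of over-the-air DSGD is that superposition computes the gradient sum in a single channel use per entry; the sketch below makes that point with a toy quadratic loss and additive receiver noise, both illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_devices, lr = 8, 20, 0.5
w = rng.normal(size=d)
targets = [rng.normal(size=d) for _ in range(n_devices)]

for it in range(50):
    grads = [w - t for t in targets]     # local gradient estimates
    rx = np.sum(grads, axis=0)           # superposition over the MAC
    rx += 0.05 * rng.normal(size=d)      # channel noise at the PS
    w -= lr * rx / n_devices             # SGD with the averaged gradient
```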
1 code implementation • 3 Jan 2019 • Mohammad Mohammadi Amiri, Deniz Gunduz
Following this digital approach, we introduce D-DSGD, in which the wireless devices employ gradient quantization and error accumulation, and transmit their gradient estimates to the PS over a multiple access channel (MAC).
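A compact sketch of the quantize-plus-error-accumulation pattern, using a one-bit sign quantizer as a stand-in for the paper's quantizer and a toy quadratic loss for the local gradients.

```python
import numpy as np

def quantize_sign(v):
    # One-bit (sign) quantizer with magnitude scaling, a common
    # digital compressor; the paper's quantizer may differ in detail.
    return np.sign(v) * np.mean(np.abs(v))

rng = np.random.default_rng(0)
d, n_devices, lr = 8, 5, 0.2
w = rng.normal(size=d)
targets = [rng.normal(size=d) for _ in range(n_devices)]
memory = [np.zeros(d) for _ in range(n_devices)]   # error accumulators

for it in range(100):
    sent = []
    for k in range(n_devices):
        g = (w - targets[k]) + memory[k]           # gradient + carried error
        q = quantize_sign(g)                       # what actually gets sent
        memory[k] = g - q                          # accumulate quantization error
        sent.append(q)
    w -= lr * np.mean(sent, axis=0)                # PS aggregates digital updates
```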