no code implementations • 11 Apr 2024 • Sara Cavallero, Fabio Saggese, Junya Shiraishi, Shashi Raj Pandey, Chiara Buratti, Petar Popovski
We consider an Internet of Things (IoT) setup in which a base station (BS) collects data from nodes that use two different communication modes.
1 code implementation • 14 Dec 2023 • Pranava Singhal, Shashi Raj Pandey, Petar Popovski
The standard client selection algorithms for Federated Learning (FL) are often unbiased and involve uniform random sampling of clients.
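The unbiased baseline mentioned above can be illustrated with a minimal sketch (a generic illustration, not code from the paper; all names are hypothetical):

```python
import random

def select_clients(clients, fraction, seed=None):
    """Standard unbiased FL client selection: sample a fixed fraction
    of clients uniformly at random each training round."""
    rng = random.Random(seed)
    k = max(1, int(fraction * len(clients)))
    return rng.sample(clients, k)

# Example: pick 30% of 10 clients for one round.
chosen = select_clients(list(range(10)), fraction=0.3, seed=42)
```

Each client is equally likely to be chosen, regardless of its data quality or channel conditions, which is precisely the property biased selection schemes aim to improve on.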
no code implementations • 5 Dec 2023 • Shashi Raj Pandey, Pierre Pinson, Petar Popovski
Data markets facilitate decentralized data exchange for applications such as prediction, learning, or inference.
no code implementations • 27 Nov 2023 • Van-Phuc Bui, Shashi Raj Pandey, Pedro M. de Sant Ana, Petar Popovski
The setup considered in the paper consists of sensors in a Networked Control System that are used to build a digital twin (DT) model of the system dynamics.
no code implementations • 14 Nov 2023 • Victor Croisfelt, Shashi Raj Pandey, Osvaldo Simeone, Petar Popovski
Conventional retransmission (ARQ) protocols are designed to ensure the correct reception, at the receiver, of every individual packet sent by the transmitter.
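The per-packet guarantee of conventional ARQ can be sketched with a minimal stop-and-wait loop (a generic illustration under simplified assumptions; `channel` is a hypothetical success predicate standing in for transmission plus ACK feedback):

```python
def stop_and_wait_arq(packets, channel, max_retries=3):
    """Minimal stop-and-wait ARQ sketch: retransmit each packet until
    the (simulated) channel delivers it, up to max_retries attempts."""
    delivered = []
    for pkt in packets:
        for _ in range(max_retries):
            if channel(pkt):          # ACK received
                delivered.append(pkt)
                break                 # move on to the next packet
        # otherwise: packet dropped after exhausting retries
    return delivered

# Example: a deterministic channel that fails each packet's first attempt.
attempts = {}
def flaky(pkt):
    attempts[pkt] = attempts.get(pkt, 0) + 1
    return attempts[pkt] >= 2         # succeed on the second try

got = stop_and_wait_arq([1, 2, 3], flaky)   # every packet needs one retransmission
```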
no code implementations • 19 May 2023 • Shashi Raj Pandey, Van Phuc Bui, Petar Popovski
A client has to decide on a transmission plan, including when not to participate in FL.
no code implementations • 17 May 2023 • Van-Phuc Bui, Thinh Q. Dinh, Israel Leyva-Mayorga, Shashi Raj Pandey, Eva Lagunas, Petar Popovski
The amount of data generated by Earth observation satellites can be enormous, posing a great challenge to rate-limited satellite-to-ground connections.
no code implementations • 3 Feb 2023 • Van-Phuc Bui, Shashi Raj Pandey, Andreas Casparsen, Federico Chiariotti, Petar Popovski
From the viewpoint of communication and networking, this represents an evolution of game networking technology, designed to interconnect massive numbers of users in real-time online gaming environments.
no code implementations • 26 Jan 2023 • Van-Phuc Bui, Shashi Raj Pandey, Federico Chiariotti, Petar Popovski
This paper presents an approach to schedule observations from different sensors in an environment to ensure their timely delivery and build a digital twin (DT) model of the system dynamics.
no code implementations • 20 Sep 2022 • Shashi Raj Pandey, Lam Duc Nguyen, Petar Popovski
Incentives that compensate for the involved costs in the decentralized training of a Federated Learning (FL) model act as a key stimulus for clients' long-term participation.
no code implementations • 15 Jun 2022 • Shashi Raj Pandey, Pierre Pinson, Petar Popovski
Finally, we reveal the structure of the formulated problem as a distributed coalition game and solve it using a simplified split-and-merge algorithm.
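The split-and-merge idea can be sketched as follows (a generic coalition-formation loop, not the paper's algorithm; the value function and the singleton-only split rule are simplifying assumptions):

```python
from itertools import combinations

def split_and_merge(players, value, max_iters=100):
    """Hedged sketch of split-and-merge coalition formation:
    merge two coalitions when the merged value exceeds the sum of
    their separate values; split a coalition into singletons when
    that raises total value. `value` maps a frozenset to a payoff."""
    partition = [frozenset([p]) for p in players]
    for _ in range(max_iters):
        changed = False
        # Merge rule: try every pair of current coalitions.
        for a, b in combinations(list(partition), 2):
            if value(a | b) > value(a) + value(b):
                partition.remove(a)
                partition.remove(b)
                partition.append(a | b)
                changed = True
                break
        # Split rule (into singletons, for simplicity).
        for c in list(partition):
            if len(c) > 1:
                singles = [frozenset([p]) for p in c]
                if sum(value(s) for s in singles) > value(c):
                    partition.remove(c)
                    partition.extend(singles)
                    changed = True
                    break
        if not changed:       # stable partition reached
            break
    return partition

# Example: a superadditive value (len squared) drives all players
# into the grand coalition.
result = split_and_merge([1, 2, 3], lambda c: len(c) ** 2)
```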
no code implementations • 4 Apr 2022 • Minh N. H. Nguyen, Huy Q. Le, Shashi Raj Pandey, Choong Seon Hong
Therefore, to develop robust generalized global and personalized models, conventional FL methods need to redesign knowledge aggregation from biased local models while accounting for the huge divergence of learning parameters due to skewed client data.
no code implementations • 10 Mar 2022 • Shashi Raj Pandey, Lam D. Nguyen, Petar Popovski
In a Federated Learning (FL) setup, a number of devices contribute to the training of a common model.
no code implementations • 6 Dec 2021 • Lam Duc Nguyen, Shashi Raj Pandey, Beatriz Soret, Arne Broering, Petar Popovski
An emerging paradigm in ML is a federated approach where the learning model is delivered to a group of heterogeneous agents partially, allowing agents to train the model locally with their own data.
no code implementations • 1 Dec 2020 • Shashi Raj Pandey, Minh N. H. Nguyen, Tri Nguyen Dang, Nguyen H. Tran, Kyi Thar, Zhu Han, Choong Seon Hong
Therefore, we need to design a learning mechanism more robust than FL that (i) unleashes a viable infrastructure for FA and (ii) trains learning models with better generalization capability.
no code implementations • 22 Sep 2020 • Tra Huong Thi Le, Nguyen H. Tran, Yan Kyaw Tun, Minh N. H. Nguyen, Shashi Raj Pandey, Zhu Han, Choong Seon Hong
In this paper, we consider an FL system that involves one base station (BS) and multiple mobile users.
1 code implementation • 7 Jul 2020 • Minh N. H. Nguyen, Shashi Raj Pandey, Tri Nguyen Dang, Eui-Nam Huh, Nguyen H. Tran, Walid Saad, Choong Seon Hong
Inspired by Dem-AI philosophy, a novel distributed learning approach is proposed in this paper.
1 code implementation • 18 Mar 2020 • Minh N. H. Nguyen, Shashi Raj Pandey, Kyi Thar, Nguyen H. Tran, Mingzhe Chen, Walid Saad, Choong Seon Hong
Consequently, many emerging cross-device AI applications will require a transition from traditional centralized learning systems towards large-scale distributed AI systems that can collaboratively perform multiple complex learning tasks.
no code implementations • 6 Nov 2019 • Latif U. Khan, Nguyen H. Tran, Shashi Raj Pandey, Walid Saad, Zhu Han, Minh N. H. Nguyen, Choong Seon Hong
Intelligent IoT devices require effective machine learning paradigms.
no code implementations • 4 Nov 2019 • Shashi Raj Pandey, Nguyen H. Tran, Mehdi Bennis, Yan Kyaw Tun, Aunas Manzoor, Choong Seon Hong
Federated learning (FL) rests on the notion of training a global model in a decentralized manner.
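The decentralized training loop behind FL is commonly instantiated as federated averaging; a minimal sketch of the aggregation step (a generic illustration, assuming scalar client "models" weighted by local dataset size, not the paper's specific scheme):

```python
import numpy as np

def fedavg(local_weights, sizes):
    """Federated averaging: combine client model parameters into a
    global model, weighting each client by its local dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(local_weights, sizes))

# Example: three clients with toy one-parameter models and unequal data.
clients = [np.array([1.0]), np.array([2.0]), np.array([4.0])]
sizes = [10, 10, 20]
global_w = fedavg(clients, sizes)   # 1*0.25 + 2*0.25 + 4*0.5 = 2.75
```

The BS (or server) only ever sees model parameters, never the raw client data, which is the core privacy argument for the decentralized setup.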