1 code implementation • 25 Jan 2024 • Vasileios Tsouvalas, Aaqib Saeed, Tanir Ozcelebi, Nirvana Meratnia
Federated Learning (FL) is a promising technique for the collaborative training of deep neural networks across multiple devices while preserving data privacy.
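The collaborative training scheme described here is commonly instantiated with federated averaging (a standard FL baseline; the abstract does not name the exact algorithm). A minimal sketch under that assumption, with a toy linear-regression model and hypothetical client data, in which raw samples never leave the clients and only weights are shared:

```python
import numpy as np

def local_update(weights, data, lr=0.1):
    """One gradient step on a client's private data (toy linear model).

    Only the updated weights are returned; the raw samples (X, y) stay
    on the device, which is the privacy-preserving part of FL.
    """
    X, y = data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(global_weights, client_datasets):
    """One round of federated averaging: clients train locally on the
    current global weights, the server averages the results."""
    client_weights = [local_update(global_weights.copy(), d)
                      for d in client_datasets]
    return np.mean(client_weights, axis=0)

# Hypothetical setup: three clients holding private regression data
# generated from the same underlying model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(300):
    w = federated_round(w, clients)
```

After enough rounds the averaged model recovers the shared underlying weights, even though the server never sees any client's data.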
no code implementations • 15 Nov 2023 • Saeed Khalilian, Vasileios Tsouvalas, Tanir Ozcelebi, Nirvana Meratnia
To ensure a smooth learning curve and proper calibration of clusters between the server and the clients, FedCode periodically transfers model weights after multiple rounds of solely communicating codebooks.
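The schedule above, in which most rounds exchange only codebooks and full model weights are transferred periodically, can be sketched roughly as follows. The 1-D k-means codebook below is a stand-in assumption for illustration, not necessarily FedCode's exact construction, and all names are hypothetical:

```python
import numpy as np

def build_codebook(weights, k=16, iters=10):
    """Cluster weight values into k centroids (simple 1-D k-means).

    Sending k centroids plus a per-weight cluster index is far cheaper
    than sending full-precision weights.
    """
    flat = weights.ravel()
    centroids = np.linspace(flat.min(), flat.max(), k)
    for _ in range(iters):
        idx = np.abs(flat[:, None] - centroids[None, :]).argmin(axis=1)
        for c in range(k):
            if np.any(idx == c):
                centroids[c] = flat[idx == c].mean()
    return centroids, idx

def decode(centroids, idx, shape):
    """Reconstruct approximate weights from centroids and indices."""
    return centroids[idx].reshape(shape)

def communication_plan(total_rounds, weight_sync_every):
    """Mostly codebook-only rounds, with a periodic full weight transfer
    to keep server-side and client-side clusters calibrated."""
    return ["weights" if r % weight_sync_every == 0 else "codebook"
            for r in range(1, total_rounds + 1)]
```

For example, `communication_plan(6, 3)` yields four cheap codebook rounds and two full weight transfers.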
no code implementations • 30 Oct 2022 • Bram van Berlo, Yang Miao, Rizqi Hersyandika, Nirvana Meratnia, Tanir Ozcelebi, Andre Kokkeler, Sofie Pollin
Joint Communication and Sensing (JCAS) is envisioned for 6G cellular networks, where sensing the operation environment, especially in the presence of humans, is as important as the high-speed wireless connectivity.

1 code implementation • 19 Aug 2022 • Vasileios Tsouvalas, Aaqib Saeed, Tanir Ozcelebi, Nirvana Meratnia
Federated Learning (FL) is a distributed machine learning paradigm that enables learning models from decentralized private datasets, where the labeling effort is entrusted to the clients.
1 code implementation • 5 Feb 2022 • Vasileios Tsouvalas, Tanir Ozcelebi, Nirvana Meratnia
To the best of our knowledge, this is the first federated SER approach, which combines self-training with federated learning to exploit both labeled and unlabeled on-device data.
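The self-training component described here can be illustrated with a minimal per-client sketch: confident predictions on unlabeled samples become pseudo-labels, and local training uses the union of labeled and pseudo-labeled data before the weights are sent for federated averaging. The softmax classifier, confidence threshold, and all names are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def pseudo_label(model_w, X_unlabeled, threshold=0.9):
    """Keep only the unlabeled samples the current model predicts with
    confidence above the threshold, using its predictions as labels."""
    probs = softmax(X_unlabeled @ model_w)
    keep = probs.max(axis=1) >= threshold
    return X_unlabeled[keep], probs[keep].argmax(axis=1)

def client_update(model_w, X_lab, y_lab, X_unlab, lr=0.1, threshold=0.9):
    """One local round: train on true labels plus confident pseudo-labels,
    then return the weights for server-side averaging (the federated step).
    """
    X_pl, y_pl = pseudo_label(model_w, X_unlab, threshold)
    X = np.vstack([X_lab, X_pl]) if len(X_pl) else X_lab
    y = np.concatenate([y_lab, y_pl]) if len(y_pl) else y_lab
    onehot = np.eye(model_w.shape[1])[y]
    grad = X.T @ (softmax(X @ model_w) - onehot) / len(y)
    return model_w - lr * grad
```

Early in training the model is unconfident, so few pseudo-labels pass the threshold; as accuracy improves, more unlabeled on-device data is pulled into training.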
no code implementations • 26 Dec 2020 • Bram van Berlo, Amany Elkelany, Tanir Ozcelebi, Nirvana Meratnia
The increasing bandwidth requirements of new wireless applications have led to standardization of the millimeter wave spectrum for high-speed wireless communication.