1 code implementation • 25 Jan 2024 • Vasileios Tsouvalas, Aaqib Saeed, Tanir Ozcelebi, Nirvana Meratnia
Federated Learning (FL) is a promising technique for the collaborative training of deep neural networks across multiple devices while preserving data privacy.
no code implementations • 15 Nov 2023 • Saeed Khalilian, Vasileios Tsouvalas, Tanir Ozcelebi, Nirvana Meratnia
To ensure a smooth learning curve and proper calibration of clusters between the server and the clients, FedCode periodically transfers model weights after multiple rounds of solely communicating codebooks.
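The codebook communication described above can be illustrated with a minimal sketch: weights are clustered into a small codebook plus per-weight assignment indices, which are far cheaper to transmit than the full weight tensor. All function names here are hypothetical, and this is a generic 1-D weight-clustering scheme, not FedCode's exact procedure.

```python
import numpy as np

def compress_weights(weights, k=4, iters=10):
    """Cluster a flat weight vector into k centroids (the codebook)
    plus per-weight assignment indices. Plain 1-D k-means, used here
    as a stand-in for codebook-based weight compression."""
    flat = weights.ravel()
    # Initialise centroids from evenly spaced quantiles of the weights.
    codebook = np.quantile(flat, np.linspace(0, 1, k))
    for _ in range(iters):
        # Assign every weight to its nearest centroid.
        assign = np.abs(flat[:, None] - codebook[None, :]).argmin(axis=1)
        # Move each centroid to the mean of its assigned weights.
        for j in range(k):
            if np.any(assign == j):
                codebook[j] = flat[assign == j].mean()
    return codebook, assign.astype(np.uint8), weights.shape

def decompress_weights(codebook, assign, shape):
    """Reconstruct an approximate weight tensor from codebook + indices."""
    return codebook[assign].reshape(shape)
```

In a round loop, a client would send only `(codebook, assign)` most rounds and the full `weights` every few rounds, matching the periodic weight transfer the abstract describes.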
no code implementations • 30 Oct 2022 • Bram van Berlo, Yang Miao, Rizqi Hersyandika, Nirvana Meratnia, Tanir Ozcelebi, Andre Kokkeler, Sofie Pollin
Joint Communication and Sensing (JCAS) is envisioned for 6G cellular networks, where sensing the operating environment, especially in the presence of humans, is as important as high-speed wireless connectivity.
1 code implementation • 19 Aug 2022 • Vasileios Tsouvalas, Aaqib Saeed, Tanir Ozcelebi, Nirvana Meratnia
Federated Learning (FL) is a distributed machine learning paradigm that enables learning models from decentralized private datasets, where the labeling effort is entrusted to the clients.
1 code implementation • 5 Feb 2022 • Vasileios Tsouvalas, Tanir Ozcelebi, Nirvana Meratnia
To the best of our knowledge, this is the first federated SER approach, which utilizes self-training in conjunction with federated learning to exploit both labeled and unlabeled on-device data.
1 code implementation • 14 Jul 2021 • Vasileios Tsouvalas, Aaqib Saeed, Tanir Ozcelebi
Notably, we show that with as little as 3% of the data labeled, FedSTAR improves the recognition rate by 13.28% on average compared to the fully supervised federated model.
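The self-training idea behind learning from unlabeled on-device data can be sketched as confidence-thresholded pseudo-labeling: the model labels its own unlabeled samples and keeps only high-confidence predictions for training. This is a generic self-training step with hypothetical names, not the exact FedSTAR procedure.

```python
import numpy as np

def pseudo_label(probs, threshold=0.9):
    """Given per-sample class probabilities (n_samples x n_classes),
    keep only samples whose top-class confidence meets the threshold.
    Returns the kept indices and their hard pseudo-labels."""
    conf = probs.max(axis=1)
    keep = conf >= threshold
    return np.flatnonzero(keep), probs[keep].argmax(axis=1)
```

The retained pseudo-labeled samples would then be mixed with the client's labeled data during local training rounds.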
no code implementations • 26 Dec 2020 • Bram van Berlo, Amany Elkelany, Tanir Ozcelebi, Nirvana Meratnia
The increasing bandwidth requirements of new wireless applications have led to standardization of the millimeter wave spectrum for high-speed wireless communication.
no code implementations • 25 Jul 2020 • Aaqib Saeed, Flora D. Salim, Tanir Ozcelebi, Johan Lukkien
Federated learning provides a compelling framework for learning models from decentralized data, but conventionally, it assumes the availability of labeled samples, whereas on-device data are generally either unlabeled or cannot be annotated readily through user interaction.
no code implementations • 27 Jul 2019 • Aaqib Saeed, Tanir Ozcelebi, Johan Lukkien
We learn a multi-task temporal convolutional network to recognize transformations applied on an input signal.
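A pretext task of this kind can be sketched as follows: apply one of several signal transformations to each input and train the network to predict which transformation was applied. The particular transformation set below is illustrative, not the paper's exact choice, and the function names are hypothetical.

```python
import numpy as np

# Candidate transformations; the pretext label is the index of the
# transformation applied to the signal.
TRANSFORMS = [
    lambda x: x,                                       # identity
    lambda x: x + np.random.normal(0, 0.1, x.shape),   # jitter (noise)
    lambda x: x * 1.5,                                 # amplitude scaling
    lambda x: x[::-1],                                 # time reversal
    lambda x: -x,                                      # negation
]

def make_pretext_batch(signals, rng):
    """Build a (transformed signals, transformation labels) batch for
    self-supervised transformation-recognition training."""
    xs, ys = [], []
    for s in signals:
        label = int(rng.integers(len(TRANSFORMS)))
        xs.append(TRANSFORMS[label](s))
        ys.append(label)
    return np.stack(xs), np.array(ys)
```

A classifier trained on these batches learns signal representations without any human annotation, which can then be transferred to the downstream recognition task.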
no code implementations • 27 Aug 2018 • Aaqib Saeed, Tanir Ozcelebi, Stojan Trajanovski, Johan Lukkien
In this paper, we propose a multi-stream temporal convolutional network to address the problem of multi-label behavioral context recognition.
no code implementations • 28 Mar 2013 • Aravind Kota Gopalakrishna, Tanir Ozcelebi, Antonio Liotta, Johan J. Lukkien
In machine learning, the choice of a learning algorithm that is suitable for the application domain is critical.