no code implementations • 16 Feb 2024 • Songjie Xie, Youlong Wu, Jiaxuan Li, Ming Ding, Khaled B. Letaief
Based on the proposed method, we further develop a variational representation encoding approach that simultaneously achieves fairness and local differential privacy (LDP).
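The snippet does not give the architecture, so here is a minimal PyTorch sketch of one standard way to privatize a learned representation with the Laplace mechanism; the network shape, the clipping rule, and the noise mechanism are all illustrative assumptions, not the paper's design.

```python
import torch
import torch.nn as nn

class LDPVariationalEncoder(nn.Module):
    """Toy encoder whose output is privatized with the Laplace mechanism.
    Clipping bounds the L1 sensitivity by 2 * clip, so a noise scale of
    2 * clip / epsilon yields epsilon-LDP. Illustrative sketch only."""

    def __init__(self, in_dim, z_dim, epsilon, clip=1.0):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, z_dim))
        self.epsilon, self.clip = epsilon, clip

    def forward(self, x):
        z = self.net(x)
        # Project z into an L1 ball of radius `clip` (bounds sensitivity).
        z = z * (self.clip / z.norm(p=1, dim=-1, keepdim=True).clamp(min=self.clip))
        # Laplace noise calibrated to the worst-case L1 distance (2 * clip).
        scale = 2 * self.clip / self.epsilon
        return z + torch.distributions.Laplace(0.0, scale).sample(z.shape)
```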
no code implementations • 18 Oct 2023 • Yuhan Yang, Youlong Wu, Yuning Jiang, Yuanming Shi
Distributed learning has become a promising computational parallelism paradigm that enables a wide range of intelligent applications, from the Internet of Things (IoT) to autonomous driving and healthcare.
no code implementations • 3 Mar 2023 • Shuai Ma, Weining Qiao, Youlong Wu, Hang Li, Guangming Shi, Dahua Gao, Yuanming Shi, Shiyin Li, Naofal Al-Dhahir
Instead of broadcasting all extracted features, the semantic encoder extracts disentangled semantic features, and only the semantic features intended for each user are selected for broadcasting, which further improves transmission efficiency.
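A hedged sketch of the selection step described here: given disentangled feature slots and per-user intent masks, only the union of requested features is put on the broadcast channel. The names, shapes, and masking interface are assumptions for illustration.

```python
import numpy as np

def select_broadcast_features(features, user_masks):
    """features: (K, d) array of K disentangled semantic feature vectors.
    user_masks: one boolean array of length K per user, marking the
    semantic features that user's task actually needs. Only the union
    of requested features is broadcast, shrinking the payload."""
    needed = np.logical_or.reduce(user_masks)   # union of all users' intents
    idx = np.flatnonzero(needed)                # feature indices to broadcast
    return idx, features[idx]                   # (|idx|, d) instead of (K, d)

# Example: 8 extracted features, two users each needing a different subset.
feats = np.random.randn(8, 16)
u1 = np.array([1, 1, 0, 0, 0, 0, 0, 0], dtype=bool)
u2 = np.array([0, 1, 1, 0, 0, 0, 0, 0], dtype=bool)
idx, payload = select_broadcast_features(feats, [u1, u2])
print(idx, payload.shape)  # [0 1 2] (3, 16): 3 of 8 features sent
```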
no code implementations • 27 Feb 2023 • Shuai Ma, Weining Qiao, Youlong Wu, Hang Li, Guangming Shi, Dahua Gao, Yuanming Shi, Shiyin Li, Naofal Al-Dhahir
Furthermore, based on the $\beta$-variational autoencoder ($\beta$-VAE), we propose a practical explainable semantic communication system design, which simultaneously achieves semantic feature selection and robustness against semantic channel noise.
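For reference, the standard $\beta$-VAE training objective the snippet builds on is reconstruction plus a $\beta$-weighted KL term; $\beta > 1$ pushes the posterior toward the isotropic prior, which encourages disentangled latent factors. The loss below is the textbook form, not the paper's full system.

```python
import torch
import torch.nn.functional as F

def beta_vae_loss(x, x_hat, mu, logvar, beta=4.0):
    """beta-VAE objective: reconstruction + beta * KL(q(z|x) || N(0, I)).
    mu, logvar parameterize the Gaussian posterior q(z|x)."""
    recon = F.mse_loss(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + beta * kl
```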
no code implementations • 21 Dec 2022 • Shuai Ma, Jing Wang, Chun Du, Hang Li, Xiaodong Liu, Youlong Wu, Naofal Al-Dhahir, Shiyin Li
To address this challenge, we propose an alternating optimization algorithm to jointly obtain the transmit beamforming and the photodetector (PD) orientation.
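The per-block solvers are not given in this snippet, but the alternating structure itself is generic: fix the orientation and solve for the beamformer, then fix the beamformer and solve for the orientation, repeating until the objective stops improving. A minimal template with hypothetical block solvers:

```python
def alternating_optimization(f, w0, theta0, opt_w, opt_theta, iters=50, tol=1e-6):
    """Generic alternating-optimization loop. opt_w(theta) and opt_theta(w)
    stand in for the paper's per-block solvers, which are not reproduced."""
    w, theta = w0, theta0
    prev = f(w, theta)
    for _ in range(iters):
        w = opt_w(theta)       # subproblem 1: beamforming, orientation fixed
        theta = opt_theta(w)   # subproblem 2: orientation, beamforming fixed
        cur = f(w, theta)
        if abs(prev - cur) < tol:  # objective stopped improving
            break
        prev = cur
    return w, theta

# Toy usage: f(w, t) = (w - t)**2 + (t - 3)**2 with closed-form block solvers.
f = lambda w, t: (w - t) ** 2 + (t - 3) ** 2
w, t = alternating_optimization(f, 0.0, 0.0, lambda t: t, lambda w: (w + 3) / 2)
print(w, t)  # both approach 3.0
```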
1 code implementation • 21 Sep 2022 • Songjie Xie, Shuai Ma, Ming Ding, Yuanming Shi, Mingjian Tang, Youlong Wu
Task-oriented communications, mostly using learning-based joint source-channel coding (JSCC), aim to design a communication-efficient edge inference system by transmitting task-relevant information to the receiver.
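A minimal sketch of the learning-based JSCC pipeline the snippet describes: an encoder maps the source directly to a low-dimensional channel input, an AWGN layer models the channel, and the receiver decodes straight to task labels rather than reconstructing the source. Dimensions, the SNR handling, and the power normalization are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TaskOrientedJSCC(nn.Module):
    """End-to-end trainable JSCC for edge inference: only task-relevant
    information needs to survive the channel, so the decoder outputs
    class logits, not a source reconstruction."""

    def __init__(self, in_dim=784, ch_dim=16, n_classes=10, snr_db=10.0):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, ch_dim))
        self.decoder = nn.Sequential(
            nn.Linear(ch_dim, 128), nn.ReLU(), nn.Linear(128, n_classes))
        self.noise_std = (10 ** (-snr_db / 10)) ** 0.5  # unit transmit power assumed

    def forward(self, x):
        z = self.encoder(x)
        z = z / z.norm(dim=-1, keepdim=True) * z.shape[-1] ** 0.5  # power norm
        y = z + self.noise_std * torch.randn_like(z)               # AWGN channel
        return self.decoder(y)  # task inference at the receiver

logits = TaskOrientedJSCC()(torch.randn(8, 784))  # batch of 8 -> (8, 10) logits
```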
1 code implementation • 31 Mar 2022 • Yuhan Yang, Yong Zhou, Youlong Wu, Yuanming Shi
Federated learning (FL), as a disruptive machine learning paradigm, enables the collaborative training of a global model over decentralized local datasets without sharing them.
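For concreteness, one round of FedAvg, the standard FL baseline consistent with this description: each client trains locally and only model weights travel, so raw datasets never leave the clients. The single-batch client representation and equal client weighting are simplifications for brevity.

```python
import copy
import torch

def fedavg_round(global_model, clients, local_steps=1, lr=0.01):
    """One FedAvg round. `clients` is assumed to be a list of
    (data, target) batches, one per client."""
    states = []
    for data, target in clients:
        local = copy.deepcopy(global_model)           # client starts from global
        opt = torch.optim.SGD(local.parameters(), lr=lr)
        for _ in range(local_steps):                  # local training only
            opt.zero_grad()
            loss = torch.nn.functional.cross_entropy(local(data), target)
            loss.backward()
            opt.step()
        states.append(local.state_dict())             # weights, never data
    # Server: average the client weights (equal weighting assumed).
    avg = {k: torch.stack([s[k] for s in states]).mean(0) for k in states[0]}
    global_model.load_state_dict(avg)
    return global_model

net = torch.nn.Linear(10, 2)
clients = [(torch.randn(16, 10), torch.randint(0, 2, (16,))) for _ in range(3)]
net = fedavg_round(net, clients)
```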
no code implementations • 16 Nov 2021 • Kai Liang, Huiru Zhong, Haoning Chen, Youlong Wu
Due to limited communication resources at the clients and the massive number of model parameters, large-scale distributed learning tasks suffer from a communication bottleneck.
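A common remedy for this bottleneck, shown here as a generic example rather than the paper's specific scheme, is top-k gradient sparsification: each client transmits only the k largest-magnitude gradient entries (values plus indices) instead of the full parameter-sized vector.

```python
import torch

def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude gradient entries; the client
    sends k (index, value) pairs instead of the dense gradient."""
    flat = grad.flatten()
    _, idx = flat.abs().topk(k)
    return idx, flat[idx]

# Example: send 1% of a million-entry gradient (~100x fewer entries on the wire).
g = torch.randn(1_000_000)
idx, vals = topk_sparsify(g, k=10_000)
print(vals.numel() / g.numel())  # 0.01
```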
no code implementations • 4 Feb 2021 • Kai Liang, Youlong Wu
We propose a practical and efficient estimator based on an r-bit Wyner-Ziv estimator proposed by Mayekar et al., which requires no probabilistic assumption on the data.
Information Theory
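The Wyner-Ziv setting compresses a value relative to correlated side information available only at the decoder. The sketch below illustrates the core modulo-lattice idea with r bits; it is a generic textbook-style construction, not the estimator of Mayekar et al., and the lattice width w is a hypothetical parameter.

```python
import numpy as np

def wz_encode(x, w, r):
    """Quantize x to the lattice w*Z and send only its coset index
    mod 2**r: r bits, independent of the magnitude of x."""
    return int(np.round(x / w)) % (2 ** r)

def wz_decode(index, y, w, r):
    """With side information y, pick the lattice point in the received
    coset that lies closest to y."""
    m = 2 ** r
    q_near = int(np.round(y / w))      # lattice point nearest the side info
    offset = (index - q_near) % m      # shift to the matching coset
    if offset > m / 2:
        offset -= m
    return (q_near + offset) * w

# Correct recovery whenever |x - y| is below roughly w * 2**r / 2:
x, y = 3.37, 3.10
idx = wz_encode(x, w=0.1, r=4)
print(wz_decode(idx, y, w=0.1, r=4))   # 3.4, within w/2 of x
```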
no code implementations • 2 Feb 2021 • Shu-Jie Cao, Lihui Yi, Haoning Chen, Youlong Wu
In this paper, we study the resource allocation and coding scheme for the MapReduce-type framework with limited resources.
Distributed Computing • Information Theory • Distributed, Parallel, and Cluster Computing
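To illustrate why coding helps in a MapReduce-type shuffle, here is a toy coded-shuffle example in the spirit of coded distributed computing: with each file mapped at two of three nodes, one XOR multicast replaces two unicast transmissions. The placement, sizes, and values are illustrative; the paper's resource-allocation and coding scheme is not reproduced.

```python
import numpy as np

# 3 nodes, 3 files, each file mapped at 2 nodes; reducer k runs on node k.
# v[f][k] = intermediate value of file f needed by reducer k (as bytes).
rng = np.random.default_rng(0)
v = {f: {k: rng.integers(0, 256, 4, dtype=np.uint8) for k in range(3)}
     for f in range(3)}

has = {0: {0, 1}, 1: {0, 2}, 2: {1, 2}}  # node -> files it mapped

# Node 0 mapped files {0, 1}, so it can compute v[0][2] (needed by node 2)
# and v[1][1] (needed by node 1) and multicast their XOR once.
coded = v[0][2] ^ v[1][1]

# Node 1 mapped file 0, cancels v[0][2] to recover its missing value.
assert np.array_equal(coded ^ v[0][2], v[1][1])
# Node 2 mapped file 1, cancels v[1][1] to recover its missing value.
assert np.array_equal(coded ^ v[1][1], v[0][2])
# One multicast served two nodes: half the traffic of uncoded unicast.
```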