Search Results for author: S. H. Song

Found 9 papers, 4 papers with code

Client Selection for Federated Policy Optimization with Environment Heterogeneity

1 code implementation · 18 May 2023 · Zhijie Xie, S. H. Song

This paper investigates the federated version of Approximate Policy Iteration (API) and derives its error bound, accounting for the approximation error introduced by environment heterogeneity.

Policy Gradient Methods · Reinforcement Learning (RL)

Augmented Deep Unfolding for Downlink Beamforming in Multi-cell Massive MIMO With Limited Feedback

no code implementations · 3 Sep 2022 · Yifan Ma, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief

In limited feedback multi-user multiple-input multiple-output (MU-MIMO) cellular networks, users send quantized information about the channel conditions to the associated base station (BS) for downlink beamforming.

Quantization

Intelligent Reflecting Surface-Aided Maneuvering Target Sensing: True Velocity Estimation

no code implementations · 30 Jul 2022 · Lei Xie, Xianghao Yu, S. H. Song

Maneuvering target sensing will be an important service of future vehicular networks, where precise velocity estimation is one of the core tasks.

FedKL: Tackling Data Heterogeneity in Federated Reinforcement Learning by Penalizing KL Divergence

1 code implementation · 18 Apr 2022 · Zhijie Xie, S. H. Song

A necessary condition for the global policy to be learnable from the local policy is also derived, which is directly related to the heterogeneity level.

Federated Learning · reinforcement-learning +1
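The KL-penalty idea behind FedKL can be sketched as follows. This is a minimal illustration under assumed names, not the paper's implementation: each client's local objective subtracts a KL-divergence term so the local policy cannot drift too far from the global policy, which counteracts data heterogeneity.

```python
import numpy as np

def kl_divergence(p, q):
    """Row-wise KL(p || q) for discrete action distributions (rows sum to 1)."""
    return np.sum(p * np.log(p / q), axis=-1)

def penalized_objective(local_return, global_pi, local_pi, beta):
    """Hypothetical local objective: the local return minus a KL penalty
    that discourages the local policy from drifting away from the
    global policy -- the general idea behind FedKL's heterogeneity fix."""
    return local_return - beta * np.mean(kl_divergence(global_pi, local_pi))

# Two states, two actions: the local policy has drifted slightly.
global_pi = np.array([[0.5, 0.5], [0.9, 0.1]])
local_pi = np.array([[0.6, 0.4], [0.8, 0.2]])
```

With `beta = 0` the penalty vanishes and the objective reduces to the raw local return; a larger `beta` pulls local updates toward the global policy.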

Graph Neural Networks for Wireless Communications: From Theory to Practice

1 code implementation · 21 Mar 2022 · Yifei Shen, Jun Zhang, S. H. Song, Khaled B. Letaief

For design guidelines, we propose a unified framework that is applicable to general design problems in wireless networks, which includes graph modeling, neural architecture design, and theory-guided performance enhancement.
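The graph-modeling step can be illustrated with a single generic message-passing layer (a sketch with assumed names, not the paper's exact architecture): nodes, e.g. transceiver pairs in an interference graph, aggregate neighbor features through shared weights, which makes the layer permutation equivariant.

```python
import numpy as np

def mp_layer(node_feats, adj, w_self, w_neigh):
    """One generic message-passing layer (illustrative sketch): each node
    sums its neighbors' features via the adjacency matrix, combines them
    with its own features through shared weights, and applies ReLU."""
    agg = adj @ node_feats  # aggregate features over graph neighbors
    return np.maximum(0.0, node_feats @ w_self + agg @ w_neigh)
```

Because the weights are shared across nodes, relabeling the nodes simply permutes the output, one reason graph-based designs can scale to networks of varying size.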

Communication-Efficient Federated Distillation with Active Data Sampling

no code implementations · 14 Mar 2022 · Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief

Federated Distillation (FD) is a recently proposed alternative that enables communication-efficient and robust FL: it reduces communication overhead by orders of magnitude compared with FedAvg and flexibly handles heterogeneous models at the clients.

Federated Learning · Privacy Preserving +1
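The communication saving of federated distillation can be sketched with a generic server-side step (an illustrative assumption, not the paper's API): clients upload per-class mean logits rather than full model weights, and the server averages them into global soft targets for local distillation.

```python
import numpy as np

def aggregate_soft_labels(client_logits, client_labels, num_classes):
    """Server step of a generic federated distillation scheme (sketch;
    names and details are assumptions): average each client's per-class
    mean logits into global soft targets. The upload is only
    num_classes x logit_dim floats per client, independent of model size."""
    per_client = []
    for logits, labels in zip(client_logits, client_labels):
        means = np.zeros((num_classes, logits.shape[1]))
        for c in range(num_classes):
            mask = labels == c
            if mask.any():
                means[c] = logits[mask].mean(axis=0)
        per_client.append(means)
    return np.mean(per_client, axis=0)
```

Each client then distills its local model against the returned soft targets, which is what decouples communication cost from model architecture.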

Learn to Communicate with Neural Calibration: Scalability and Generalization

no code implementations · 1 Oct 2021 · Yifan Ma, Yifei Shen, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief

Furthermore, such networks will vary dynamically and significantly, which makes comprehensive analytical models intractable to develop.

Computational Efficiency · Management

Neural Calibration for Scalable Beamforming in FDD Massive MIMO with Implicit Channel Estimation

no code implementations · 3 Aug 2021 · Yifan Ma, Yifei Shen, Xianghao Yu, Jun Zhang, S. H. Song, Khaled B. Letaief

Channel estimation and beamforming play critical roles in frequency-division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems.

Client-Edge-Cloud Hierarchical Federated Learning

1 code implementation · 16 May 2019 · Lumin Liu, Jun Zhang, S. H. Song, Khaled B. Letaief

To combine their advantages, we propose a client-edge-cloud hierarchical Federated Learning system, supported by the HierFAVG algorithm, which allows multiple edge servers to perform partial model aggregation.

Federated Learning
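The partial-aggregation structure can be sketched as follows. This is a simplified illustration with assumed names, not the paper's implementation: each edge server averages its own clients' models, then the cloud averages the edge models, each step weighted by data size.

```python
import numpy as np

def fed_avg(models, sizes):
    """Data-size-weighted average of parameter vectors (the FedAvg step)."""
    return np.average(models, axis=0, weights=np.asarray(sizes, dtype=float))

def cloud_round(edge_groups, edge_sizes):
    """One cloud aggregation in a HierFAVG-style hierarchy (sketch under
    simplifying assumptions): edge servers first perform partial
    aggregation over their clients, then the cloud aggregates the edge
    models. In the actual algorithm, edge aggregation runs several local
    rounds between consecutive cloud rounds."""
    edge_models = [fed_avg(m, s) for m, s in zip(edge_groups, edge_sizes)]
    totals = [sum(s) for s in edge_sizes]
    return fed_avg(edge_models, totals)
```

The point of the hierarchy is that frequent, cheap client-edge communication replaces most of the expensive client-cloud rounds.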
