Search Results for author: Ju Wang

Found 7 papers, 0 papers with code

Toward a Better Understanding of Fourier Neural Operators: Analysis and Improvement from a Spectral Perspective

no code implementations · 10 Apr 2024 · Shaoxiang Qin, Fuyuan Lyu, Wenhui Peng, Dingyang Geng, Ju Wang, Naiping Gao, Xue Liu, Liangzhu Leon Wang

In solving partial differential equations (PDEs), Fourier Neural Operators (FNOs) have exhibited notable effectiveness compared to Convolutional Neural Networks (CNNs).

Ensemble Learning
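
Since the paper analyzes FNOs from a spectral perspective, a minimal PyTorch sketch of a generic FNO spectral-convolution layer may be a useful reference point. It follows the standard FNO recipe (FFT, truncate to the lowest Fourier modes, learned complex weights, inverse FFT); sizes are illustrative and this is not the authors' implementation.

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Generic 1-D FNO spectral layer: FFT, keep the lowest `modes`
    frequencies, mix channels with learned complex weights, inverse FFT
    back to the spatial grid. Illustrative, not the paper's code."""
    def __init__(self, in_ch, out_ch, modes):
        super().__init__()
        self.modes = modes  # number of low-frequency modes kept (<= n//2 + 1)
        scale = 1.0 / (in_ch * out_ch)
        self.weight = nn.Parameter(
            scale * torch.randn(in_ch, out_ch, modes, dtype=torch.cfloat))

    def forward(self, x):                      # x: (batch, in_ch, n)
        x_ft = torch.fft.rfft(x)               # to frequency domain
        out_ft = torch.zeros(x.size(0), self.weight.size(1), x_ft.size(-1),
                             dtype=torch.cfloat, device=x.device)
        # channel mixing happens only on the retained low-frequency modes;
        # higher frequencies are zeroed, which is the FNO's spectral truncation
        out_ft[..., :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[..., :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))
```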

Teacher-Student Architecture for Knowledge Distillation: A Survey

no code implementations · 8 Aug 2023 · Chengming Hu, Xuan Li, Dan Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu

Recently, Teacher-Student architectures have been effectively and widely adopted for various knowledge distillation (KD) objectives, including knowledge compression, knowledge expansion, knowledge adaptation, and knowledge enhancement.

Knowledge Distillation · Regression
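
As background for the KD objectives the survey covers, here is a minimal sketch of the classic logit-based distillation loss (Hinton-style soft targets plus hard-label cross-entropy). The temperature and weighting values are illustrative defaults, not values from the survey.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Soft targets from the teacher (KL on temperature-softened
    distributions, scaled by T^2) blended with ordinary cross-entropy."""
    soft_t = F.log_softmax(teacher_logits / temperature, dim=-1)
    soft_s = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(soft_s, soft_t, log_target=True,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```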

Multi-agent Attention Actor-Critic Algorithm for Load Balancing in Cellular Networks

no code implementations · 14 Mar 2023 · Jikun Kang, Di Wu, Ju Wang, Ekram Hossain, Xue Liu, Gregory Dudek

In cellular networks, User Equipment (UE) hands off from one Base Station (BS) to another, giving rise to the load balancing problem among the BSs.
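
The entry names a multi-agent attention actor-critic algorithm but gives no architectural detail. Purely as a hypothetical illustration of the general idea (in the spirit of attention-based multi-agent actor-critic methods such as MAAC, not this paper's design), a centralized critic can attend over the other base-station agents' encoded observation-action pairs:

```python
import torch
import torch.nn as nn

class AttentionCritic(nn.Module):
    """Hypothetical centralized critic: each agent's Q-value conditions on
    an attention-weighted summary of all agents' encodings."""
    def __init__(self, obs_dim, act_dim, hidden=64):
        super().__init__()
        self.encode = nn.Linear(obs_dim + act_dim, hidden)
        self.attn = nn.MultiheadAttention(hidden, num_heads=4,
                                          batch_first=True)
        self.value = nn.Linear(2 * hidden, 1)

    def forward(self, obs, act):        # obs: (batch, n_agents, obs_dim)
        h = torch.relu(self.encode(torch.cat([obs, act], dim=-1)))
        ctx, _ = self.attn(h, h, h)     # each agent attends to all agents
        # per-agent value from its own embedding plus the attended context
        return self.value(torch.cat([h, ctx], dim=-1))  # (batch, n_agents, 1)
```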

Teacher-Student Architecture for Knowledge Learning: A Survey

no code implementations · 28 Oct 2022 · Chengming Hu, Xuan Li, Dan Liu, Xi Chen, Ju Wang, Xue Liu

To tackle this issue, Teacher-Student architectures were first utilized in knowledge distillation, where simple student networks can achieve comparable performance to deep teacher networks.

Knowledge Distillation · Multi-Task Learning
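
Complementing the logit-based loss sketched earlier, Teacher-Student architectures also transfer intermediate representations. A generic FitNets-style feature ("hint") loss, with illustrative channel sizes and not tied to this survey's taxonomy, looks like:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HintLoss(nn.Module):
    """FitNets-style feature distillation: a 1x1 conv maps the student's
    intermediate feature map into the teacher's channel space, then the
    two are matched with an L2 loss. Channel sizes are placeholders."""
    def __init__(self, student_ch=64, teacher_ch=256):
        super().__init__()
        self.regressor = nn.Conv2d(student_ch, teacher_ch, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        # student_feat: (B, student_ch, H, W); teacher_feat: (B, teacher_ch, H, W)
        # detach() keeps gradients from flowing into the frozen teacher
        return F.mse_loss(self.regressor(student_feat), teacher_feat.detach())
```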

Detecting Soil Moisture Levels Using Battery-Free Wi-Fi Tag

no code implementations · 4 Feb 2022 · Wenli Jiao, Ju Wang, Yelu He, Xiangdong Xi, Xiaojiang Chen

In this paper, we design and implement a high-accuracy, low-cost chipless soil moisture sensing system called SoilTAG.

TAG
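
The snippet gives no implementation detail, but chipless moisture sensing generally exploits the fact that soil permittivity varies with water content, which shifts a tag's backscattered response. Purely as a hypothetical illustration of the calibration step (not SoilTAG's actual pipeline), one might fit a curve from a measured frequency shift to a moisture level:

```python
import numpy as np

def fit_moisture_calibration(freq_shift_mhz, moisture_pct, degree=2):
    """Hypothetical calibration: fit a polynomial mapping a measured
    resonance-frequency shift to volumetric soil moisture. The data
    points and degree below are made-up placeholders."""
    return np.polynomial.Polynomial.fit(freq_shift_mhz, moisture_pct, degree)

# usage with fabricated calibration points
cal = fit_moisture_calibration([0.0, 1.2, 2.5, 3.9], [5, 15, 25, 35])
print(cal(1.8))  # estimated moisture (%) for a 1.8 MHz shift
```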

MOBA: Multi-teacher Model Based Reinforcement Learning

no code implementations · 29 Sep 2021 · Jikun Kang, Xi Chen, Ju Wang, Chengming Hu, Xue Liu, Gregory Dudek

Results show that, compared with state-of-the-art (SOTA) model-free methods, our method can improve data efficiency and system performance by up to 75% and 10%, respectively.

Decision Making · Knowledge Distillation · +4
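
MOBA's actual objective is not given in this snippet. As a hypothetical sketch of the multi-teacher idea, a student policy can be distilled toward a weighted average of several teachers' action distributions:

```python
import torch
import torch.nn.functional as F

def multi_teacher_distillation(student_logits, teacher_logits_list,
                               weights=None):
    """Hypothetical multi-teacher target: a weighted mixture of the
    teachers' softmax distributions, matched with a KL loss. Not the
    paper's formulation."""
    n = len(teacher_logits_list)
    weights = weights or [1.0 / n] * n
    target = sum(w * F.softmax(t, dim=-1)
                 for w, t in zip(weights, teacher_logits_list))
    return F.kl_div(F.log_softmax(student_logits, dim=-1), target,
                    reduction="batchmean")
```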
