no code implementations • 10 Aug 2024 • Fengyu Wang, Yuan Zheng, Wenjun Xu, Junxiao Liang, Ping Zhang
Compared to traditional communication systems that focus on the accurate reconstruction of bit sequences, semantic communications (SemComs), which aim to successfully deliver the connotation of information, have been regarded as the key technology for next-generation communication systems.
no code implementations • 12 Apr 2024 • Jiangjing Hu, Fengyu Wang, Wenjun Xu, Hui Gao, Ping Zhang
Intelligent task-oriented semantic communications (SemComs) have witnessed great progress with the development of deep learning (DL).
1 code implementation • 3 Apr 2024 • Haoran Sun, Lixin Liu, Junjie Li, Fengyu Wang, Baohua Dong, Ran Lin, Ruohui Huang
To address this challenge, we introduce Conifer, a novel instruction tuning dataset, designed to enhance LLMs to follow multi-level instructions with complex constraints.
no code implementations • 12 Aug 2023 • Yuan Zheng, Fengyu Wang, Wenjun Xu, Miao Pan, Ping Zhang
Semantic communications, which aim to ensure the successful delivery of the meaning of information, are expected to be one of the key techniques for next-generation communications.
no code implementations • 28 Jun 2023 • Lanhua Xiang, Fengyu Wang, Wenjun Xu, Tiankui Zhang, Miao Pan, Zhu Han
First, a cluster-evolutionary target association (CETA) algorithm is proposed, which divides the UAV swarm into multiple sub-swarms and individually matches these sub-swarms to targets.
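The cluster-then-match idea behind CETA can be illustrated with a minimal sketch: partition UAV positions into sub-swarms, then assign each sub-swarm to a target. All function names, the k-means-style clustering, and the greedy distance-based matching below are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch: split a UAV swarm into sub-swarms, then assign
# each sub-swarm to a target. The clustering and matching rules here
# are placeholders for the CETA procedure described in the paper.
import math

def kmeans(points, k, iters=20):
    """Tiny k-means: partition 2-D points into k clusters."""
    centers = points[:k]  # naive initialization from the first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest current center
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # recompute each center as its cluster's centroid
        centers = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return clusters, centers

def match_subswarms_to_targets(centers, targets):
    """Greedy one-to-one matching of sub-swarm centroids to targets."""
    assignment = {}
    free = set(range(len(targets)))
    for i, c in enumerate(centers):
        j = min(free, key=lambda t: math.dist(c, targets[t]))
        assignment[i] = j
        free.remove(j)
    return assignment

# Two spatially separated groups of three UAVs each, and two targets.
uavs = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
targets = [(12, 12), (-1, -1)]
clusters, centers = kmeans(uavs, 2)
print(match_subswarms_to_targets(centers, targets))
```

A production version would replace the greedy matching with an optimal assignment (e.g., the Hungarian algorithm) and an evolutionary refinement step, as the "cluster-evolutionary" name suggests.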
no code implementations • 12 Apr 2023 • Jiangjing Hu, Fengyu Wang, Wenjun Xu, Hui Gao, Ping Zhang
Semantic communications are expected to be an innovative solution to the emerging intelligent applications in the era of connected intelligence.
no code implementations • 6 Dec 2022 • Wenjun Xu, Yimeng Zhang, Fengyu Wang, Zhijin Qin, Chenyao Liu, Ping Zhang
Internet of Vehicles (IoV) is expected to become the central infrastructure to provide advanced services to connected vehicles and users for higher transportation efficiency and security.
no code implementations • ACL 2022 • Cheng Chen, Yichun Yin, Lifeng Shang, Xin Jiang, Yujia Qin, Fengyu Wang, Zhi Wang, Xiao Chen, Zhiyuan Liu, Qun Liu
However, pre-training large language models requires intensive computational resources, and most models are trained from scratch without reusing existing pre-trained models, which is wasteful.
1 code implementation • Findings (ACL) 2021 • Fanchao Qi, Yangyi Chen, Fengyu Wang, Zhiyuan Liu, Xiao Chen, Maosong Sun
We use this method to build an English SKB and a French SKB, and conduct comprehensive evaluations from both intrinsic and extrinsic perspectives.