no code implementations • 21 Nov 2024 • Yunrui Sun, Gang Hu, Yinglei Teng, Dunbo Cai
Split Learning (SL) is a promising collaborative machine learning approach that enables resource-constrained devices to train models without sharing raw data, reducing on-device computational load while preserving privacy.
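As a concrete illustration of the split-learning pattern (not this paper's specific architecture or resource-allocation scheme), the sketch below partitions a model at a cut layer: the device runs the early layers and sends only intermediate activations to the server, which finishes the forward and backward pass and returns the gradient at the cut. The layer sizes, cut point, and optimizers are placeholders.

```python
import torch
import torch.nn as nn

# Client holds the early layers; only its activations leave the device.
client_model = nn.Sequential(nn.Linear(32, 64), nn.ReLU())
# Server holds the remaining layers and the loss.
server_model = nn.Sequential(nn.Linear(64, 10))

client_opt = torch.optim.SGD(client_model.parameters(), lr=1e-2)
server_opt = torch.optim.SGD(server_model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

def split_train_step(x, y):
    """One split-learning step: raw data x never reaches the server."""
    client_opt.zero_grad()
    server_opt.zero_grad()

    smashed = client_model(x)                            # device-side forward
    smashed_remote = smashed.detach().requires_grad_()   # "sent" to the server

    loss = loss_fn(server_model(smashed_remote), y)      # server-side forward
    loss.backward()                                      # server-side backward
    smashed.backward(smashed_remote.grad)                # gradient "returned" to device

    server_opt.step()
    client_opt.step()
    return loss.item()

# Example: one step on a random mini-batch.
loss = split_train_step(torch.randn(8, 32), torch.randint(0, 10, (8,)))
```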
no code implementations • 14 Jun 2024 • Gang Hu, Yinglei Teng, Nan Wang, Zhu Han
Federated Edge Learning (FEEL) is emerging as a pioneering distributed machine learning paradigm for 6G hyper-connectivity, harnessing data from Internet of Things (IoT) devices while upholding data privacy.
no code implementations • journal 2024 • Gang Hu, Zaidao Wen, Yafei Lv, Jianting Zhang, Qian Wu
To mitigate semantic ambiguities during the alignment of local features, we design a local information soft-alignment (LISA) module.
Ranked #3 on Cross-Modal Retrieval on RSICD
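The listing above does not detail the LISA module itself, so the sketch below only illustrates the general idea of soft alignment between local features: each word is matched against a softmax-weighted mixture of image regions rather than a single hard-assigned region. The feature shapes, temperature, and pooling are assumptions, not the paper's design.

```python
import torch
import torch.nn.functional as F

def soft_align_similarity(regions, words, tau=0.1):
    """Illustrative soft alignment between local features.

    regions: (R, D) local image-region features
    words:   (W, D) word features from the text encoder
    """
    regions = F.normalize(regions, dim=-1)
    words = F.normalize(words, dim=-1)
    sim = words @ regions.t()                     # (W, R) cosine similarities
    attn = F.softmax(sim / tau, dim=-1)           # soft assignment over regions
    attended = attn @ regions                     # (W, D) region mixture per word
    word_scores = (words * attended).sum(dim=-1)  # per-word alignment score
    return word_scores.mean()                     # image-text similarity

score = soft_align_similarity(torch.randn(36, 256), torch.randn(12, 256))
```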
no code implementations • 8 May 2024 • Gang Hu, Ming Gu
This paper introduces a hybrid approach combining Markowitz's portfolio theory with reinforcement learning, using knowledge distillation to train the agents.
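For background on the Markowitz half of the hybrid (the RL and distillation parts are not sketched here), a minimal mean-variance calculation: given estimated returns and a covariance matrix, the unconstrained optimal weights are proportional to the inverse covariance times the expected returns, scaled by a risk-aversion parameter. The assets, numbers, and normalization below are illustrative, not the paper's setup.

```python
import numpy as np

def markowitz_weights(mu, cov, gamma=5.0):
    """Unconstrained mean-variance weights: w* = (1/gamma) * inv(cov) @ mu,
    rescaled to sum to 1 (ignoring short-sale constraints)."""
    raw = np.linalg.solve(cov, mu) / gamma
    return raw / raw.sum()

# Toy example with three assets (illustrative numbers only).
mu = np.array([0.08, 0.05, 0.03])            # expected annual returns
cov = np.array([[0.10, 0.02, 0.01],
                [0.02, 0.06, 0.01],
                [0.01, 0.01, 0.04]])         # return covariance
print(markowitz_weights(mu, cov))
```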
2 code implementations • 10 Mar 2024 • Gang Hu, Ke Qin, Chenhan Yuan, Min Peng, Alejandro Lopez-Lira, Benyou Wang, Sophia Ananiadou, Jimin Huang, Qianqian Xie
While the progression of Large Language Models (LLMs) has notably propelled financial analysis, their application has largely been confined to single-language settings, leaving the potential of bilingual Chinese-English capability untapped.
2 code implementations • 20 Feb 2024 • Qianqian Xie, Weiguang Han, Zhengyu Chen, Ruoyu Xiang, Xiao Zhang, Yueru He, Mengxi Xiao, Dong Li, Yongfu Dai, Duanyu Feng, Yijing Xu, Haoqiang Kang, Ziyan Kuang, Chenhan Yuan, Kailai Yang, Zheheng Luo, Tianlin Zhang, Zhiwei Liu, Guojun Xiong, Zhiyang Deng, Yuechen Jiang, Zhiyuan Yao, Haohang Li, Yangyang Yu, Gang Hu, Jiajia Huang, Xiao-Yang Liu, Alejandro Lopez-Lira, Benyou Wang, Yanzhao Lai, Hao Wang, Min Peng, Sophia Ananiadou, Jimin Huang
Our evaluation of 15 representative LLMs, including GPT-4, ChatGPT, and the latest Gemini, reveals several key findings: while LLMs excel at information extraction (IE) and textual analysis, they struggle with advanced reasoning and complex tasks such as text generation and forecasting.
no code implementations • 9 Nov 2023 • Gang Hu
This study enhances a Deep Q-Network (DQN) trading model by incorporating advanced techniques like Prioritized Experience Replay, Regularized Q-Learning, Noisy Networks, Dueling, and Double DQN.
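Among the listed extensions, Double DQN is the most compact to illustrate: the online network selects the next action while the target network evaluates it, which reduces the overestimation bias of vanilla DQN targets. The network sizes, discount factor, and toy batch below are placeholders, not the study's configuration.

```python
import torch
import torch.nn as nn

online_net = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 2))
target_net = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 2))
target_net.load_state_dict(online_net.state_dict())

def double_dqn_target(reward, next_state, done, gamma=0.99):
    """Double DQN: the online net picks the argmax action, the target net scores it."""
    with torch.no_grad():
        next_action = online_net(next_state).argmax(dim=1, keepdim=True)
        next_q = target_net(next_state).gather(1, next_action).squeeze(1)
        return reward + gamma * (1.0 - done) * next_q

# Toy batch of 8 transitions (reward, next_state, done).
y = double_dqn_target(torch.zeros(8), torch.randn(8, 4), torch.zeros(8))
```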
1 code implementation • 5 Jul 2023 • Qiulei Wang, Lei Yan, Gang Hu, Wenli Chen, Jean Rabault, Bernd R. Noack
The resulting dynamic feature-based DRL (DF-DRL) automatically learns a feedback control law in the plant without requiring a dynamic model.
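One reading of "dynamic feature-based" is that the policy observes a short history of each sensor signal, and optionally its rate of change, rather than an instantaneous snapshot. The window length, time step, and finite-difference derivative below are assumptions for illustration only, not the paper's feature definition.

```python
from collections import deque
import numpy as np

class DynamicFeatureObserver:
    """Turns a scalar sensor stream into a history-based observation vector."""

    def __init__(self, history_len=16, dt=0.01):
        self.buf = deque([0.0] * history_len, maxlen=history_len)
        self.dt = dt

    def observe(self, sensor_value):
        self.buf.append(float(sensor_value))
        hist = np.asarray(self.buf)
        deriv = np.gradient(hist, self.dt)      # finite-difference rate of change
        return np.concatenate([hist, deriv])    # observation fed to the DRL policy

obs = DynamicFeatureObserver()
for t in range(100):                            # fake pressure-sensor stream
    state = obs.observe(np.sin(0.1 * t))
```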
1 code implementation • 7 Apr 2023 • Yiyuan Yang, Rongshang Li, Qiquan Shi, Xijun Li, Gang Hu, Xing Li, Mingxuan Yuan
This paper proposes a novel Stream-Graph neural network-based Data Prefetcher (SGDP).
no code implementations • 17 Feb 2023 • Gang Hu, Yinglei Teng, Nan Wang, F. Richard Yu
Federated Learning (FL) is a novel distributed machine learning approach that leverages data from Internet of Things (IoT) devices while maintaining data privacy.
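The canonical aggregation step in FL is federated averaging: the server combines client parameter updates weighted by their local sample counts, so no raw data leaves the devices. The sketch below is generic, not this paper's specific aggregation or resource-allocation scheme.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client parameter vectors by local dataset size."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    stacked = np.stack(client_weights)           # (num_clients, num_params)
    return coeffs @ stacked                      # aggregated global parameters

# Three IoT clients with different amounts of local data.
global_params = fedavg(
    [np.random.randn(10) for _ in range(3)],
    client_sizes=[120, 300, 80],
)
```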
no code implementations • 20 Aug 2019 • Gang Hu, Lingbo Liu, DaCheng Tao, Jie Song, K. C. S. Kwok
This study uses machine learning techniques to resolve the conflict between limited wind tunnel tests, which produce unreliable results, and a complete investigation of the interference effects, which is costly and time-consuming.
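In this setting a machine learning model acts as a surrogate: it is trained on the limited set of configurations actually tested in the wind tunnel and then predicts interference effects for untested configurations. The features, the gradient-boosting model, and the toy data below are assumptions for illustration, not the study's exact pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical training data: (spacing_x, spacing_y, wind_direction)
# mapped to an interference factor measured in the wind tunnel.
rng = np.random.default_rng(0)
X_tested = rng.uniform([1.0, 1.0, 0.0], [8.0, 8.0, 90.0], size=(60, 3))
y_tested = 1.0 + 0.1 * rng.standard_normal(60)   # placeholder measurements

surrogate = GradientBoostingRegressor().fit(X_tested, y_tested)

# Predict interference factors for configurations never tested in the tunnel.
X_untested = np.array([[3.5, 2.0, 45.0], [6.0, 6.0, 30.0]])
print(surrogate.predict(X_untested))
```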
no code implementations • 21 Jan 2019 • Gang Hu, K. C. S. Kwok
Numerous studies have been carried out to measure wind pressures around circular cylinders since the early 20th century due to their engineering significance.