no code implementations • 18 Mar 2024 • Xiangyu Chen, Jing Liu, Ye Wang, Pu Wang, Matthew Brand, Guanghui Wang, Toshiaki Koike-Akino
Low-rank adaptation (LoRA) and its variants are widely employed in fine-tuning large models, including large language models for natural language processing and diffusion models for computer vision.
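To make the idea concrete, here is a minimal numpy sketch of the core LoRA update: the pretrained weight W stays frozen, and only two small low-rank factors A and B are trained, with the adapted layer computing x(W + AB). All names and shapes below are illustrative, not taken from the paper.

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=1.0):
    """Forward pass through a LoRA-adapted linear layer.

    W: frozen pretrained weight, shape (d_in, d_out).
    A: trainable down-projection, shape (d_in, r), r << min(d_in, d_out).
    B: trainable up-projection, shape (r, d_out), initialized to zero.
    Output: x @ (W + alpha * A @ B), computed without forming the full update.
    """
    return x @ W + alpha * (x @ A) @ B

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 4, 2
W = rng.normal(size=(d_in, d_out))       # frozen base weights
A = rng.normal(size=(d_in, r)) * 0.01    # small random init
B = np.zeros((r, d_out))                 # zero init: adapter starts as a no-op
x = rng.normal(size=(3, d_in))

y = lora_forward(x, W, A, B)
```

Because B starts at zero, the adapted layer initially reproduces the base model exactly; fine-tuning then learns only the `d_in*r + r*d_out` adapter parameters instead of the full `d_in*d_out` weight matrix.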
no code implementations • 23 Aug 2023 • Xingyue Pu, Stefan Zohren, Stephen Roberts, Xiaowen Dong
Network momentum provides a novel type of risk premium, which exploits the interconnections among assets in a financial network to predict future returns.
no code implementations • 22 Aug 2023 • Xingyue Pu, Stephen Roberts, Xiaowen Dong, Stefan Zohren
We investigate the concept of network momentum, a novel trading signal derived from momentum spillover across assets.
no code implementations • 20 Sep 2022 • Haifeng Xia, Pu Wang, Toshiaki Koike-Akino, Ye Wang, Philip Orlik, Zhengming Ding
Domain adaptation (DA) aims to transfer the knowledge of a well-labeled source domain to facilitate unlabeled target learning.
no code implementations • 19 Feb 2022 • Jianyuan Yu, Pu Wang, Toshiaki Koike-Akino, Philip V. Orlik
This paper considers indoor localization using multi-modal wireless signals including Wi-Fi, inertial measurement unit (IMU), and ultra-wideband (UWB).
no code implementations • 28 Dec 2021 • Jianyuan Yu, Pu Wang, Toshiaki Koike-Akino, Ye Wang, Philip V. Orlik, R. Michael Buehrer
The granularity matching is realized by pairing two feature maps from the CSI and beam SNR at different granularity levels and linearly combining all paired feature maps into a fused feature map with learnable weights.
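One plausible reading of that fusion step can be sketched as follows: feature maps from the two modalities (CSI and beam SNR) are paired level by level, each pair is combined (here, by element-wise addition, an assumption on our part), and the weighted sum over levels uses softmax-normalized learnable weights. Function and variable names are hypothetical, not from the paper.

```python
import numpy as np

def fuse_granularity_levels(csi_maps, snr_maps, weights):
    """Combine paired multi-granularity feature maps into one fused map.

    csi_maps, snr_maps: lists of L same-shaped feature maps (one per
        granularity level) from the two wireless modalities.
    weights: length-L array of learnable scalars; softmax-normalized so
        the fused map is a convex combination of the paired maps.
    """
    w = np.exp(weights - np.max(weights))
    w = w / w.sum()                          # normalized learnable weights
    fused = np.zeros_like(csi_maps[0])
    for wi, c, s in zip(w, csi_maps, snr_maps):
        fused += wi * (c + s)                # pair level-i maps, weight, accumulate
    return fused

# Two granularity levels of 2x2 feature maps, equal learnable weights.
csi = [np.ones((2, 2)), np.ones((2, 2))]
snr = [np.ones((2, 2)), np.ones((2, 2))]
fused = fuse_granularity_levels(csi, snr, np.array([0.0, 0.0]))
```

The softmax over the weights is a common design choice for learnable combination coefficients, since it keeps the fused map on the same scale as the inputs regardless of the raw weight values.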