no code implementations • 11 May 2025 • Hongwei Shang, Nguyen Vo, Nitin Yadav, Tian Zhang, Ajit Puthenputhussery, Xunfan Cai, Shuyi Chen, Prijith Chandran, Changsung Kang
To leverage the ranking power of LLMs while meeting the low-latency demands of production systems, we propose a novel framework that distills a high-performing LLM into a more efficient, low-latency student model.
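The distillation idea above can be sketched with a generic listwise objective: soften the teacher LLM's relevance scores into a distribution over candidates and train the student to match it. This is a minimal, hypothetical sketch of knowledge distillation for ranking, not the paper's actual loss; the function name, temperature, and score values are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum()

def listwise_distill_loss(teacher_scores, student_scores, tau=2.0):
    """KL divergence between temperature-softened teacher and student
    ranking distributions over one query's candidate list.
    A generic distillation objective, sketched for illustration only."""
    p = softmax(np.asarray(teacher_scores, dtype=float) / tau)
    q = softmax(np.asarray(student_scores, dtype=float) / tau)
    return float(np.sum(p * (np.log(p) - np.log(q))))

teacher = [3.2, 1.1, 0.3]   # hypothetical LLM relevance scores
student = [2.9, 1.0, 0.5]   # hypothetical low-latency student scores
loss = listwise_distill_loss(teacher, student)
```

A loss of zero means the student reproduces the teacher's ranking distribution exactly; in practice this term would be combined with a task loss on ground-truth labels.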
no code implementations • 25 Feb 2025 • Shuyi Chen, Ferdinando Fioretto, Feng Qiu, Shixiang Zhu
Extreme hazard events such as wildfires and hurricanes increasingly threaten power systems, causing widespread outages and disrupting critical services.
no code implementations • 17 Aug 2024 • Yingzhe Hui, Shuyi Chen, Yifan Qin, Weixiao Meng, Qiushi Zhang, Wei Jin
Reconfigurable Intelligent Surfaces (RIS) are programmable metasurfaces utilizing sub-wavelength meta-atoms and a controller for precise electromagnetic wave manipulation.
no code implementations • 26 Mar 2024 • Shuyi Chen, Shixiang Zhu
We propose a novel data pre-processing algorithm, Orthogonal to Bias (OB), which is designed to eliminate the influence of a group of continuous sensitive variables, thus promoting counterfactual fairness in machine learning applications.
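The "orthogonal to bias" idea can be illustrated with a simple linear residualization: project the features onto the orthogonal complement of the sensitive variables' column space, so the debiased features carry no linear dependence on them. This is an assumed sketch of the general technique the name suggests, not the paper's exact algorithm.

```python
import numpy as np

def orthogonal_to_bias(X, S):
    """Remove the linear influence of continuous sensitive variables S
    from a feature matrix X by subtracting the least-squares fit of X
    on S (both centered). Illustrative sketch only."""
    Xc = X - X.mean(axis=0)
    Sc = S - S.mean(axis=0)
    B, *_ = np.linalg.lstsq(Sc, Xc, rcond=None)
    return Xc - Sc @ B       # residual is orthogonal to the columns of Sc

rng = np.random.default_rng(0)
S = rng.normal(size=(200, 2))                       # sensitive variables
X = S @ rng.normal(size=(2, 5)) + rng.normal(size=(200, 5))
X_ob = orthogonal_to_bias(X, S)
```

After the projection, the sample covariance between each debiased feature and each sensitive variable is numerically zero, which is the sense in which downstream models trained on `X_ob` cannot exploit linear information about `S`.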
no code implementations • 1 Dec 2023 • Shuyi Chen, Yingzhe Hui, Yifan Qin, Yueyi Yuan, Weixiao Meng, Xuewen Luo, Hsiao-Hwa Chen
Semantic communication has gained significant attention recently due to its advantages in achieving higher transmission efficiency by focusing on semantic information instead of bit-level information.
no code implementations • 23 Oct 2023 • Yihan Cao, Shuyi Chen, Ryan Liu, Zhiruo Wang, Daniel Fried
A persistent challenge in table question answering (TableQA) via executable-program generation has been adapting to varied table structures, which typically require domain-specific logical forms.
no code implementations • 14 Jun 2023 • Shuyi Chen, Kaize Ding, Shixiang Zhu
Graph neural networks have shown impressive capabilities in solving various graph learning tasks, particularly excelling in node classification.
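The node-classification setting mentioned above rests on message passing: each layer aggregates a node's neighborhood features through a normalized adjacency matrix before a learned transformation. Below is a minimal, generic graph-convolution step for illustration; it is not the paper's specific architecture, and the toy graph and weights are made up.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: add self-loops, symmetrically
    normalize the adjacency, then apply a linear map and ReLU.
    A standard GCN-style propagation rule, sketched for illustration."""
    A_hat = A + np.eye(A.shape[0])                 # self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = (A_hat * d_inv_sqrt).T * d_inv_sqrt   # D^{-1/2} A_hat D^{-1/2}
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy 3-node path graph with 2-dimensional node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.eye(3)[:, :2]          # arbitrary input features
W = np.ones((2, 2))           # arbitrary weights
Z = gcn_layer(A, H, W)        # per-node hidden representations
```

Stacking such layers and ending with a per-node softmax yields the standard GNN node-classification pipeline.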
no code implementations • 7 Mar 2023 • Ankit Shah, Shuyi Chen, Kejun Zhou, Yue Chen, Bhiksha Raj
Preliminary results show that (1) the proposed BECR produces more dispersed embeddings on the test set, (2) BECR improves the PaSST model without additional computational complexity, and (3) STFT preprocessing outperforms CQT on all tasks we tested.
no code implementations • 9 Oct 2022 • Shuyi Chen, Bochao Zhao, Mingjun Zhong, Wenpeng Luan, Yixin Yu
Across the NILM results in various cases, SSL generally outperforms zero-shot learning in load disaggregation performance without requiring any sub-metering data from the target data sets.