Search Results for author: Yuanhang Yang

Found 6 papers, 4 papers with code

Enhancing Multivariate Time Series Forecasting with Mutual Information-driven Cross-Variable and Temporal Modeling

no code implementations • 1 Mar 2024 • Shiyi Qi, Liangjian Wen, Yiduo Li, Yuanhang Yang, Zhe Li, Zhongwen Rao, Lujia Pan, Zenglin Xu

To substantiate this claim, we introduce Cross-variable Decorrelation Aware feature Modeling (CDAM) for channel-mixing approaches, which refines channel mixing by minimizing redundant information between channels while enhancing relevant mutual information.

Multivariate Time Series Forecasting • Time Series
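As a rough illustration of the cross-variable objective described in the entry above, the sketch below penalizes redundancy between per-variable (per-channel) representations, using a cross-correlation term as a crude stand-in for the mutual-information criterion mentioned in the abstract. This is a minimal sketch under assumed shapes and names (`redundancy_penalty`, tensors of shape `(batch, channels, dim)`), not the paper's actual CDAM module.

```python
# Minimal sketch (NOT the paper's CDAM implementation): penalize cross-channel
# redundancy with a correlation term, a crude proxy for the mutual-information
# objective described in the abstract. Shapes and names are assumptions.
import torch

def redundancy_penalty(z: torch.Tensor) -> torch.Tensor:
    """z: (batch, channels, dim) per-variable representations."""
    b, c, d = z.shape
    x = z.permute(1, 0, 2).reshape(c, b * d)                # one row per channel
    x = (x - x.mean(dim=1, keepdim=True)) / (x.std(dim=1, keepdim=True) + 1e-6)
    corr = (x @ x.T) / x.shape[1]                           # (channels, channels)
    corr = corr - torch.diag(torch.diagonal(corr))          # drop self-correlation
    return (corr ** 2).mean()

# Hypothetical usage: add the penalty to the usual forecasting loss.
# loss = forecast_loss + 0.1 * redundancy_penalty(channel_features)
```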

Enhancing Efficiency in Sparse Models with Sparser Selection

no code implementations • 27 Feb 2024 • Yuanhang Yang, Shiyi Qi, Wenchao Gu, Chaozheng Wang, Cuiyun Gao, Zenglin Xu

To address this issue, we present \tool, a novel MoE designed to enhance both the efficacy and efficiency of sparse MoE models.

Language Modelling • Machine Translation

When Federated Learning Meets Pre-trained Language Models' Parameter-Efficient Tuning Methods

1 code implementation • 20 Dec 2022 • Zhuo Zhang, Yuanhang Yang, Yong Dai, Lizhen Qu, Zenglin Xu

To facilitate research on PETuning in FL, we also develop FedPETuning, a federated tuning framework that allows practitioners to conveniently apply different PETuning methods under the FL training paradigm.

Federated Learning
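For context on the entry above, the sketch below shows the general pattern of federated parameter-efficient tuning: clients fine-tune only small tunable modules (e.g., adapters or LoRA) on top of a frozen pre-trained language model, and the server aggregates just those parameters with FedAvg. This is an illustrative sketch, not the FedPETuning codebase; the `client.train` and `client.num_examples` interfaces are hypothetical.

```python
# Illustrative sketch of federated PETuning (NOT the FedPETuning framework itself):
# only the small tunable parameters travel between server and clients.
from copy import deepcopy

def fedavg_petuning(server_tunables: dict, clients, rounds: int = 10) -> dict:
    """server_tunables: name -> tensor for adapter/LoRA/prefix parameters only."""
    for _ in range(rounds):
        updates, weights = [], []
        for client in clients:
            # Hypothetical client API: local PETuning on a frozen backbone,
            # returning the updated tunable parameters only.
            local = client.train(deepcopy(server_tunables))
            updates.append(local)
            weights.append(client.num_examples)
        total = sum(weights)
        # FedAvg over the tunable parameters; the pre-trained backbone never moves.
        server_tunables = {
            name: sum((w / total) * upd[name] for w, upd in zip(weights, updates))
            for name in server_tunables
        }
    return server_tunables
```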

Once is Enough: A Light-Weight Cross-Attention for Fast Sentence Pair Modeling

1 code implementation • 11 Oct 2022 • Yuanhang Yang, Shiyi Qi, Chuanyi Liu, Qifan Wang, Cuiyun Gao, Zenglin Xu

Transformer-based models have achieved great success on sentence pair modeling tasks, such as answer selection and natural language inference (NLI).

Answer Selection • Natural Language Inference • +2

No More Fine-Tuning? An Experimental Evaluation of Prompt Tuning in Code Intelligence

1 code implementation • 24 Jul 2022 • Chaozheng Wang, Yuanhang Yang, Cuiyun Gao, Yun Peng, Hongyu Zhang, Michael R. Lyu

Moreover, the performance of fine-tuning relies heavily on the amount of downstream data, while in practice scenarios with scarce data are common.

Code Summarization • Code Translation
