no code implementations • 5 Apr 2024 • Xinyu Ma, Xu Chu, Zhibang Yang, Yang Lin, Xin Gao, Junfeng Zhao
With the increasingly powerful performance and enormous scale of Pretrained Language Models (PLMs), improving parameter efficiency in fine-tuning has become crucial for effective and efficient adaptation to various downstream tasks.
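Parameter-efficient fine-tuning generally means freezing the large pretrained weights and training only a small add-on. A minimal sketch of one common instance of this idea, a LoRA-style low-rank adapter — shown purely to illustrate parameter efficiency, not as this paper's method:

```python
import numpy as np

class LowRankAdapter:
    """Freeze a large pretrained weight W and train only a low-rank
    update B @ A (LoRA-style sketch; illustrative, not the paper's method)."""

    def __init__(self, W, rank=4, seed=0):
        rng = np.random.default_rng(seed)
        self.W = W                                   # frozen pretrained weight
        self.A = 0.01 * rng.normal(size=(rank, W.shape[1]))
        self.B = np.zeros((W.shape[0], rank))        # zero init: starts equal to W

    def forward(self, x):
        # Effective weight is W + B @ A; only A and B receive gradients.
        return (self.W + self.B @ self.A) @ x

    def trainable_params(self):
        return self.A.size + self.B.size

W = np.zeros((768, 768))                             # stand-in for a frozen layer
adapter = LowRankAdapter(W, rank=4)
print(adapter.trainable_params(), W.size)            # 6144 trainable vs 589824 frozen
```

At rank 4 the adapter trains roughly 1% of the parameters of the full 768x768 layer, which is the efficiency the snippet above refers to.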
1 code implementation • 18 Jan 2024 • Ruizhe Zhang, Xinke Jiang, Yuchen Fang, Jiayuan Luo, Yongxin Xu, Yichen Zhu, Xu Chu, Junfeng Zhao, Yasha Wang
In recent years, Graph Neural Networks (GNNs), particularly those based on the message-passing approach, have shown considerable effectiveness in a variety of graph learning tasks.
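The message-passing approach mentioned here boils down to each node aggregating its neighbors' features and then applying a learned transform. A generic single-layer sketch (mean aggregation; not the specific architecture of this paper):

```python
import numpy as np

def message_passing_layer(features, adj, weight):
    """One round of mean-aggregation message passing over a graph.
    Generic sketch of the message-passing idea, not any particular model."""
    # Each node averages the features of its neighbors.
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0            # avoid division by zero for isolated nodes
    agg = adj @ features / deg
    # Combine with a learned linear transform and a ReLU nonlinearity.
    return np.maximum(agg @ weight, 0.0)

# Toy graph: 3 nodes on a path 0-1-2, with 2-dim node features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
x = np.eye(3, 2)                   # node features
w = np.eye(2)                      # identity transform for clarity
h = message_passing_layer(x, adj, w)
print(h)                           # node 1 averages nodes 0 and 2
```

Stacking such layers lets information propagate k hops in k rounds, which is what makes the approach effective across many graph tasks.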
1 code implementation • 28 Dec 2023 • Zhihao Yu, Liantao Ma, Yasha Wang, Junfeng Zhao
In particular, a hierarchical convolution structure is introduced to extract information from the series at various scales.
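One simple way to realize multi-scale extraction is to convolve the series at several temporal resolutions obtained by pooling. A hedged sketch of that hierarchical idea — the pooling scales and kernel are illustrative choices, not the paper's exact architecture:

```python
import numpy as np

def avg_pool(series, k):
    """Downsample a 1-D series by averaging non-overlapping windows of size k."""
    n = len(series) // k
    return series[: n * k].reshape(n, k).mean(axis=1)

def multiscale_features(series, kernel, scales=(1, 2, 4)):
    """Convolve the series at several temporal resolutions.
    Generic sketch of hierarchical multi-scale convolution."""
    feats = []
    for s in scales:
        coarse = avg_pool(series, s) if s > 1 else series
        feats.append(np.convolve(coarse, kernel, mode="valid"))
    return feats

series = np.sin(np.linspace(0, 6.28, 32))
feats = multiscale_features(series, np.ones(3) / 3)   # simple smoothing kernel
print([f.shape for f in feats])                        # [(30,), (14,), (6,)]
```

Coarser scales see longer-range trends with the same kernel width, which is the point of extracting features hierarchically rather than at a single resolution.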
no code implementations • 26 Dec 2023 • Xinke Jiang, Ruizhe Zhang, Yongxin Xu, Rihong Qiu, Yue Fang, Zhiyuan Wang, Jinyi Tang, Hongxin Ding, Xu Chu, Junfeng Zhao, Yasha Wang
We explore how the rise of Large Language Models (LLMs) significantly impacts task performance in the field of Natural Language Processing.
no code implementations • 4 Oct 2023 • Hongxin Ding, Peinie Zou, Zhiyuan Wang, Junfeng Zhao, Yasha Wang, Qiang Zhou
Extracting medical knowledge from healthcare texts enhances downstream tasks like medical knowledge graph construction and clinical decision-making.
1 code implementation • 28 Oct 2022 • Chaohe Zhang, Xu Chu, Liantao Ma, Yinghao Zhu, Yasha Wang, Jiangtao Wang, Junfeng Zhao
M3Care is an end-to-end model that compensates for the missing information of patients with missing modalities in order to perform clinical analysis.
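One common way to compensate for a patient's missing modality is to borrow from similar patients who do have it. A minimal sketch of that compensation idea — the similarity-weighted averaging here is an illustrative stand-in, not M3Care's actual mechanism:

```python
import numpy as np

def impute_missing_modality(emb, mask, sim):
    """Fill the modality embedding of patients who lack it with a
    similarity-weighted average over patients who have it.
    Illustrative sketch, not the M3Care model itself."""
    filled = emb.copy()
    have = mask.astype(bool)
    for i in np.where(~have)[0]:
        w = sim[i, have]
        w = w / w.sum()            # normalize similarities to weights
        filled[i] = w @ emb[have]
    return filled

# 3 patients, 2-dim modality embedding; patient 2 lacks this modality.
emb = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
mask = np.array([1, 1, 0])
sim = np.array([[1.0, 0.2, 0.5],
                [0.2, 1.0, 0.5],
                [0.5, 0.5, 1.0]])
print(impute_missing_modality(emb, mask, sim)[2])   # [0.5 0.5]
```

Equally similar to both observed patients, patient 2 receives the average of their embeddings; in an end-to-end model the similarities themselves would be learned.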
no code implementations • 21 Apr 2022 • Xinyu Ma, Xu Chu, Yasha Wang, Hailong Yu, Liantao Ma, Wen Tang, Junfeng Zhao
Thus, to address these issues, we propose to group strongly correlated features and learn feature correlations in a group-wise manner, reducing the learning complexity without losing generality.
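The grouping step can be illustrated with a simple correlation-based greedy pass: features whose pairwise correlation exceeds a threshold land in the same group, so correlations only need to be learned within groups. The threshold and greedy strategy below are illustrative assumptions, not the paper's grouping method:

```python
import numpy as np

def group_features(X, threshold=0.8):
    """Greedily group columns of X whose absolute Pearson correlation with a
    seed column exceeds `threshold`. Illustrative stand-in for the
    paper's feature-grouping strategy."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    unassigned = set(range(corr.shape[0]))
    groups = []
    while unassigned:
        seed = min(unassigned)
        group = [j for j in sorted(unassigned) if corr[seed, j] >= threshold]
        unassigned.difference_update(group)
        groups.append(group)
    return groups

rng = np.random.default_rng(0)
a = rng.normal(size=100)
b = rng.normal(size=100)
# Columns 0 and 1 are near-duplicates; column 2 is independent.
X = np.column_stack([a, a + 0.01 * rng.normal(size=100), b])
print(group_features(X))   # columns 0 and 1 grouped together
```

With g groups of size at most m, pairwise correlation learning drops from O(d^2) terms to O(g * m^2), which is the complexity reduction the snippet above alludes to.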