no code implementations • 22 Mar 2024 • Xuemei Tang, Zekun Deng, Qi Su, Hao Yang, Jun Wang
Additionally, we evaluate the capabilities of Large Language Models (LLMs) on tasks related to ancient Chinese history.
no code implementations • 22 Feb 2024 • Xuemei Tang, Jun Wang, Qi Su
Recently, large language models (LLMs) have been successful in relation extraction (RE) tasks, especially in few-shot settings.
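A common way to apply an LLM to few-shot relation extraction is to place a handful of labeled demonstrations in the prompt and ask the model to label a new entity pair. The sketch below is an illustrative assumption, not the paper's method: the prompt template, demonstration examples, and relation labels are all hypothetical.

```python
# Hypothetical few-shot RE prompt construction; the template, examples,
# and label set are illustrative assumptions, not taken from the paper.

FEW_SHOT_EXAMPLES = [
    ("Beijing is the capital of China.", "Beijing", "China", "capital_of"),
    ("Marie Curie was born in Warsaw.", "Marie Curie", "Warsaw", "born_in"),
]

def build_prompt(sentence: str, head: str, tail: str) -> str:
    """Assemble a few-shot prompt asking the model to name the relation
    between the head and tail entities in the final sentence."""
    lines = ["Classify the relation between the two entities."]
    for s, h, t, rel in FEW_SHOT_EXAMPLES:
        lines.append(f"Sentence: {s}\nHead: {h}\nTail: {t}\nRelation: {rel}")
    # The query instance ends with "Relation:" so the model completes the label.
    lines.append(f"Sentence: {sentence}\nHead: {head}\nTail: {tail}\nRelation:")
    return "\n\n".join(lines)

prompt = build_prompt("Paris is the capital of France.", "Paris", "France")
```

The returned string would be sent to an LLM, whose completion is parsed as the predicted relation label.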
no code implementations • 21 Feb 2024 • Xuemei Tang, Qi Su
To address this challenge, we propose a two-stage curriculum learning (TCL) framework specifically designed for sequence labeling tasks.
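The core idea of a two-stage curriculum is to train first on easier examples and then on the full dataset. The sketch below is a minimal illustration under stated assumptions: the difficulty measure (sentence length) and the equal-halves split are hypothetical choices, and the paper's TCL framework may define both differently.

```python
# Minimal two-stage curriculum sketch for sequence labeling data.
# Difficulty = sentence length and a 50/50 split are illustrative
# assumptions, not the paper's actual TCL design.

def two_stage_curriculum(dataset, difficulty=len):
    """Order examples easy-to-hard, then return two training stages:
    stage 1 uses the easier half, stage 2 uses the full ordered set."""
    ordered = sorted(dataset, key=difficulty)
    half = len(ordered) // 2
    return ordered[:half], ordered

sentences = ["短句", "一个稍长的句子", "中文分词", "这是一条更长也更复杂的训练句子"]
stage1, stage2 = two_stage_curriculum(sentences)
# stage1 holds the two shortest sentences; stage2 holds all four, easy-to-hard.
```

In practice the model would be trained for some epochs on `stage1` before continuing on `stage2`, so that hard examples only appear after the model has fit the easy ones.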
no code implementations • 3 Jun 2023 • Xuemei Tang, Jun Wang, Qi Su
Recently, it has become common to integrate Chinese sequence labeling results into syntactic and semantic parsing to enhance their performance.
no code implementations • ACL 2022 • Xuemei Tang, Qi Su, Jun Wang
The evolution of language follows a pattern of gradual change.
no code implementations • 22 Jan 2022 • Xuemei Tang, Jun Wang, Qi Su
In recent years, deep learning has achieved significant success in the Chinese word segmentation (CWS) task.