1 code implementation • 21 Feb 2024 • Xinrong Zhang, Yingfa Chen, Shengding Hu, Zihang Xu, JunHao Chen, Moo Khai Hao, Xu Han, Zhen Leng Thai, Shuo Wang, Zhiyuan Liu, Maosong Sun
Processing and reasoning over long contexts is crucial for many practical applications of Large Language Models (LLMs), such as document comprehension and agent construction.
no code implementations • 2 Nov 2023 • Keyu Ding, Yongcan Wang, Zihang Xu, Zhenzhen Jia, Shijin Wang, Cong Liu, Enhong Chen
The results demonstrate that we have achieved state-of-the-art performance for the first time on the Full-mode Key-sequence to Characters (FK2C) task.
1 code implementation • 22 Oct 2023 • Zihang Xu, Haifan Gong, Xiang Wan, Haofeng Li
To address the task, we propose a new UDA framework based on Appearance and Structure Consistency, named ASC.
1 code implementation • 27 Jun 2023 • Zihang Xu, Ziqing Yang, Yiming Cui, Shijin Wang
IDOL achieves state-of-the-art performance on ReClor and LogiQA, the two most representative benchmarks in logical reasoning MRC. It is also proven to generalize to different pre-trained models and to other types of MRC benchmarks, such as RACE and SQuAD 2.0, while maintaining competitive general language understanding ability on GLUE tasks.
Ranked #1 on Reading Comprehension on ReClor
1 code implementation • 24 Jul 2022 • Zihang Xu, Zhenghua Xu, Shuo Zhang, Thomas Lukasiewicz
Unlike most existing semi-supervised learning methods, adversarial-training-based methods distinguish samples from different sources by learning the data distribution of the segmentation maps, leading the segmenter to generate more accurate predictions.
1 code implementation • SemEval (NAACL) 2022 • Zihang Xu, Ziqing Yang, Yiming Cui, Zhigang Chen
This paper describes our system designed for SemEval-2022 Task 8: Multilingual News Article Similarity.
no code implementations • COLING 2022 • Ziqing Yang, Zihang Xu, Yiming Cui, Baoxin Wang, Min Lin, Dayong Wu, Zhigang Chen
It covers Standard Chinese, Yue Chinese, and six other ethnic minority languages.