no code implementations • ACL 2022 • Shuai Zhang, Yongliang Shen, Zeqi Tan, Yiquan Wu, Weiming Lu
Named entity recognition (NER) is the fundamental task of recognizing specific types of entities in a given sentence.
no code implementations • 22 Apr 2024 • Xiaoxia Cheng, Zeqi Tan, Weiming Lu
In this paper, we propose an information re-organization (InfoRE) method before proceeding with the reasoning to enhance the reasoning ability of LLMs.
1 code implementation • 27 Feb 2024 • Wenqi Zhang, Ke Tang, Hai Wu, Mengna Wang, Yongliang Shen, Guiyang Hou, Zeqi Tan, Peng Li, Yueting Zhuang, Weiming Lu
Large Language Models exhibit robust problem-solving capabilities across diverse tasks.
no code implementations • 14 Oct 2023 • Wenqi Zhang, Yongliang Shen, Qingpeng Nong, Zeqi Tan, Yanna Ma, Weiming Lu
To generate a tree whose nodes are expressions, we employ a layer-wise parallel decoding strategy: at each layer we decode multiple independent expressions (leaf nodes) in parallel, then repeat this layer by layer to sequentially generate the parent-node expressions that depend on earlier ones.
Ranked #2 on Math Word Problem Solving on MathQA
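The layer-wise parallel decoding idea can be sketched as below. This is a hypothetical toy, not the authors' neural model: `decode_layer` stands in for the parallel decoding step, and integer entries in a context index previously decoded expressions.

```python
def decode_layer(contexts):
    # Hypothetical stand-in for one parallel decoding step: each context
    # independently yields one expression, so the layer can run in parallel.
    return ["(" + " + ".join(ctx) + ")" for ctx in contexts]

def layerwise_decode(layers):
    """Decode a tree of expressions layer by layer.

    `layers` is a list of layers; each layer is a list of contexts
    (the child expressions a new node depends on). Leaf nodes come
    first; later layers may reference earlier outputs by index.
    """
    decoded = []  # all expressions produced so far, in decoding order
    for contexts in layers:
        # Resolve integer references to already-decoded expressions.
        resolved = [[decoded[i] if isinstance(i, int) else i for i in ctx]
                    for ctx in contexts]
        # Expressions within a layer are mutually independent.
        decoded.extend(decode_layer(resolved))
    return decoded

# Layer 0: two leaf expressions; layer 1: a parent combining them.
exprs = layerwise_decode([[["a"], ["b"]], [[0, 1]]])
# exprs == ["(a)", "(b)", "((a) + (b))"]
```

The key property this illustrates is that each layer is fully parallelizable, while dependencies between layers are still respected sequentially.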
1 code implementation • 12 Oct 2023 • Shuhui Wu, Yongliang Shen, Zeqi Tan, Wenqi Ren, Jietian Guo, ShiLiang Pu, Weiming Lu
Distantly supervised named entity recognition (DS-NER) aims to locate entity mentions and classify their types using only knowledge bases or gazetteers and an unlabeled corpus.
1 code implementation • 26 May 2023 • Yongliang Shen, Zeqi Tan, Shuhui Wu, Wenqi Zhang, Rongsheng Zhang, Yadong Xi, Weiming Lu, Yueting Zhuang
Prompt learning is a new paradigm for utilizing pre-trained language models and has achieved great success in many tasks.
Ranked #1 on Nested Named Entity Recognition on ACE 2004
no code implementations • 26 May 2023 • Xuming Hu, Aiwei Liu, Zeqi Tan, Xin Zhang, Chenwei Zhang, Irwin King, Philip S. Yu
These techniques fail to preserve the semantic consistency of the original sentences when rule-based augmentations are adopted, and fail to preserve the syntactic structure of sentences when relations are expressed with seq2seq models, resulting in less diverse augmentations.
1 code implementation • 5 May 2023 • Zeqi Tan, Shen Huang, Zixia Jia, Jiong Cai, Yinghui Li, Weiming Lu, Yueting Zhuang, Kewei Tu, Pengjun Xie, Fei Huang, Yong Jiang
We also find that the limited context length renders the retrieved knowledge invisible to the model.
Tasks: Multilingual Named Entity Recognition, Named Entity Recognition (+4 more)
no code implementations • 3 Nov 2022 • Zeqi Tan, Yongliang Shen, Xuming Hu, Wenqi Zhang, Xiaoxia Cheng, Weiming Lu, Yueting Zhuang
Joint entity and relation extraction has been a core task in the field of information extraction.
Tasks: Contrastive Learning, Joint Entity and Relation Extraction (+1 more)
1 code implementation • 21 Oct 2022 • Wenqi Zhang, Yongliang Shen, Yanna Ma, Xiaoxia Cheng, Zeqi Tan, Qingpeng Nong, Weiming Lu
A math word problem solver requires both precise relational reasoning about the quantities in the text and reliable generation of diverse equations.
Ranked #1 on Math Word Problem Solving on Math23K (using extra training data)
1 code implementation • 27 Apr 2022 • Shuhui Wu, Yongliang Shen, Zeqi Tan, Weiming Lu
In the refine stage, proposals interact with each other, and richer contextual information is incorporated into the proposal representations.
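The refine-stage interaction described above can be sketched as follows. This is a hypothetical simplification: a real model would use attention between proposal representations, whereas this toy mixes each proposal vector with the mean of all proposals to show only the interaction structure.

```python
def refine_proposals(proposals, rounds=1, alpha=0.5):
    """Hypothetical sketch of a refine stage: each proposal vector is
    blended with the mean of all proposals, so contextual information
    from the other proposals flows into its representation. `alpha`
    controls how much context is mixed in per round."""
    for _ in range(rounds):
        dim = len(proposals[0])
        mean = [sum(p[d] for p in proposals) / len(proposals)
                for d in range(dim)]
        proposals = [[(1 - alpha) * p[d] + alpha * mean[d]
                      for d in range(dim)]
                     for p in proposals]
    return proposals

# Two 2-d proposal representations move toward their shared context.
refined = refine_proposals([[0.0, 2.0], [2.0, 0.0]])
# refined == [[0.5, 1.5], [1.5, 0.5]]
```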
1 code implementation • ACL 2022 • Yongliang Shen, Xiaobin Wang, Zeqi Tan, Guangwei Xu, Pengjun Xie, Fei Huang, Weiming Lu, Yueting Zhuang
Each instance query predicts one entity, and by feeding all instance queries simultaneously, we can query all entities in parallel.
Ranked #1 on Nested Named Entity Recognition on GENIA
Tasks: Chinese Named Entity Recognition, Named Entity Recognition (+5 more)
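The instance-query mechanism above can be sketched as a single parallel decode. Everything here is a hypothetical illustration (toy logits, a made-up `decode_entities` helper), not the authors' model: the point is that each query independently points to a span and a type, so all entities fall out of one pass.

```python
def argmax(xs):
    # Index of the largest score in a list.
    return max(range(len(xs)), key=xs.__getitem__)

def decode_entities(start_logits, end_logits, type_logits, type_names):
    """Row i of each logit matrix comes from instance query i; since
    the queries are fed simultaneously, every (start, end, type)
    triple is read off independently, i.e. entities are decoded in
    parallel rather than autoregressively."""
    return [(argmax(s), argmax(e), type_names[argmax(t)])
            for s, e, t in zip(start_logits, end_logits, type_logits)]

# Two instance queries over a 4-token sentence,
# e.g. "Barack Obama visited Paris".
ents = decode_entities(
    start_logits=[[5.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 6.0]],
    end_logits=[[1.0, 4.0, 0.0, 0.0], [0.0, 0.0, 0.0, 7.0]],
    type_logits=[[3.0, 0.5], [0.2, 2.0]],
    type_names=["PER", "LOC"],
)
# ents == [(0, 1, "PER"), (3, 3, "LOC")]
```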
1 code implementation • ACL 2021 • Xin Xin, Jinlong Li, Zeqi Tan
In this paper, we study graph-based constituent parsing in the setting where binarization is not conducted as a pre-processing step, so a constituent tree may contain nodes with more than two children.
Ranked #3 on Constituency Parsing on CTB5
1 code implementation • 19 May 2021 • Zeqi Tan, Yongliang Shen, Shuai Zhang, Weiming Lu, Yueting Zhuang
We utilize a non-autoregressive decoder to predict the final set of entities in one pass, while still capturing dependencies between entities.
Ranked #6 on Nested Named Entity Recognition on ACE 2005
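Predicting entities as an unordered set typically requires an order-invariant training loss via bipartite matching between predictions and gold entities. The sketch below is a hypothetical brute-force version over small sets (real systems would use the Hungarian algorithm for efficiency), with a 0/1 cost per pair for simplicity.

```python
from itertools import permutations

def set_match_cost(preds, golds):
    """Brute-force bipartite matching between predicted and gold
    entities (a sketch of the set-prediction idea, not the authors'
    loss). `preds` and `golds` are equal-length lists of entity
    tuples; the pairwise cost is 0 for an exact match, else 1."""
    def cost(p, g):
        return 0 if p == g else 1
    # Try every assignment of predictions to gold slots, keep the best.
    return min(sum(cost(p, golds[j]) for j, p in zip(perm, preds))
               for perm in permutations(range(len(golds))))

preds = [(3, 3, "LOC"), (0, 1, "PER")]
golds = [(0, 1, "PER"), (3, 3, "LOC")]
# Order-invariant: the permutation (1, 0) pairs each prediction with
# its gold entity at zero total cost.
# set_match_cost(preds, golds) == 0
```

Because the cost is minimized over all assignments, the loss does not penalize the decoder for emitting entities in a different order than the gold annotation lists them.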
1 code implementation • ACL 2021 • Yongliang Shen, Xinyin Ma, Zeqi Tan, Shuai Zhang, Wen Wang, Weiming Lu
Although these methods have the innate ability to handle nested NER, they suffer from high computational cost, neglect of boundary information, under-utilization of spans that partially match entities, and difficulty in recognizing long entities.
Ranked #6 on Nested Named Entity Recognition on GENIA
Tasks: Chinese Named Entity Recognition, Named Entity Recognition (+3 more)