1 code implementation • 21 Aug 2023 • Tianyu Yu, Chengyue Jiang, Chao Lou, Shen Huang, Xiaobin Wang, Wei Liu, Jiong Cai, Yangning Li, Yinghui Li, Kewei Tu, Hai-Tao Zheng, Ningyu Zhang, Pengjun Xie, Fei Huang, Yong Jiang
However, LLMs are sometimes too unconstrained for natural language understanding (NLU) tasks, which typically have restricted input and output formats.
no code implementations • 1 Jul 2023 • Jiong Cai, Yong Jiang, Yue Zhang, Chengyue Jiang, Ke Yu, Jianhui Ji, Rong Xiao, Haihong Tang, Tao Wang, Zhongqiang Huang, Pengjun Xie, Fei Huang, Kewei Tu
We also show that pretraining the QE module with auto-generated QE data from user logs can further improve the overall performance.
1 code implementation • 5 May 2023 • Zeqi Tan, Shen Huang, Zixia Jia, Jiong Cai, Yinghui Li, Weiming Lu, Yueting Zhuang, Kewei Tu, Pengjun Xie, Fei Huang, Yong Jiang
We also find that the limited context length can render the retrieved knowledge invisible to the model.
Multilingual Named Entity Recognition • Named Entity Recognition +4
1 code implementation • 3 Dec 2022 • Xinyu Wang, Jiong Cai, Yong Jiang, Pengjun Xie, Kewei Tu, Wei Lu
MoRe contains a text retrieval module and an image-based retrieval module, which retrieve knowledge related to the input text and image, respectively, from the knowledge corpus.
Ranked #1 on Multi-modal Named Entity Recognition on SNAP (MNER)
Multi-modal Named Entity Recognition • Named Entity Recognition +3
1 code implementation • SemEval (NAACL) 2022 • Xinyu Wang, Yongliang Shen, Jiong Cai, Tao Wang, Xiaobin Wang, Pengjun Xie, Fei Huang, Weiming Lu, Yueting Zhuang, Kewei Tu, Wei Lu, Yong Jiang
Our system wins 10 out of 13 tracks in the MultiCoNER shared task.
Multilingual Named Entity Recognition • Named Entity Recognition +1
no code implementations • COLING 2020 • Ruyue Hong, Jiong Cai, Kewei Tu
Deep inside-outside recursive autoencoder (DIORA) is a neural-based model designed for unsupervised constituency parsing.
no code implementations • ACL 2020 • Jun Li, Yifan Cao, Jiong Cai, Yong Jiang, Kewei Tu
Unsupervised constituency parsing aims to learn a constituency parser from a training corpus without parse tree annotations.
1 code implementation • ACL 2020 • Zixia Jia, Youmi Ma, Jiong Cai, Kewei Tu
Semantic dependency parsing, which aims to find rich bi-lexical relationships, allows words to have multiple dependency heads, resulting in graph-structured representations.
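The distinction above can be made concrete: in a dependency tree every word has exactly one head, while in a semantic dependency graph a word may be the argument of several predicates at once. A minimal sketch (hypothetical representation, not the paper's implementation) of such a graph:

```python
from collections import defaultdict

def build_graph(edges):
    """Map each dependent token to all of its (head, relation) pairs.

    Unlike a tree, a word may appear as a dependent of multiple heads.
    """
    heads = defaultdict(list)
    for head, dep, rel in edges:
        heads[dep].append((head, rel))
    return dict(heads)

# Illustrative edges (relation labels are made up for the example):
# in "Mary wants to buy a book", "Mary" is an argument of both
# "wants" and "buy", so it receives two heads in the graph.
edges = [
    ("wants", "Mary", "ARG1"),
    ("buy", "Mary", "ARG1"),
    ("wants", "buy", "ARG2"),
    ("buy", "book", "ARG2"),
]
graph = build_graph(edges)
print(graph["Mary"])  # two heads: [('wants', 'ARG1'), ('buy', 'ARG1')]
```

A tree-based parser would be forced to drop one of the two edges into "Mary"; the graph representation keeps both bi-lexical relationships.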
1 code implementation • EMNLP 2017 • Jiong Cai, Yong Jiang, Kewei Tu
The encoder part of our model is discriminative and globally normalized, which allows us to use rich features as well as universal linguistic priors.
Dependency Grammar Induction • Unsupervised Dependency Parsing