no code implementations • Findings (NAACL) 2022 • Liwen Zhang, Zixia Jia, Wenjuan Han, Zilong Zheng, Kewei Tu
Adversarial attacks on structured prediction models face various challenges, such as the difficulty of perturbing discrete words, the issue of sentence quality, and the sensitivity of outputs to small perturbations.
1 code implementation • 13 Nov 2023 • Junpeng Li, Zixia Jia, Zilong Zheng
Document-level Relation Extraction (DocRE), which aims to extract relations from a long context, is a critical challenge in achieving fine-grained structural comprehension and generating interpretable document representations.
1 code implementation • 5 May 2023 • Zeqi Tan, Shen Huang, Zixia Jia, Jiong Cai, Yinghui Li, Weiming Lu, Yueting Zhuang, Kewei Tu, Pengjun Xie, Fei Huang, Yong Jiang
We also find that the limited context length can render the retrieved knowledge invisible to the model.
1 code implementation • 17 Dec 2022 • Zixia Jia, Zhaohui Yan, Wenjuan Han, Zilong Zheng, Kewei Tu
Prior works on joint Information Extraction (IE) typically model instance (e.g., event triggers, entities, roles, relations) interactions by representation enhancement, type-dependency scoring, or global decoding.
1 code implementation • NAACL 2022 • Xinyu Wang, Min Gui, Yong Jiang, Zixia Jia, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Kewei Tu
As text representations play the most important role in MNER, in this paper we propose Image-text Alignments (ITA) to align image features into the textual space, so that the attention mechanism in transformer-based pretrained textual embeddings can be better utilized.
Ranked #1 on Multi-modal Named Entity Recognition on Twitter-17
no code implementations • ACL (IWPT) 2021 • Xinyu Wang, Zixia Jia, Yong Jiang, Kewei Tu
This paper describes the system used in the submission from the SHANGHAITECH team to the IWPT 2021 Shared Task.
1 code implementation • ACL 2021 • Xinyu Wang, Yong Jiang, Zhaohui Yan, Zixia Jia, Nguyen Bach, Tao Wang, Zhongqiang Huang, Fei Huang, Kewei Tu
The objective function of knowledge distillation is typically the cross-entropy between the teacher's and the student's output distributions.
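The distillation objective mentioned above can be sketched minimally in plain Python. This is a generic illustration, not the paper's implementation; the temperature parameter and the example logits are assumptions for demonstration.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy H(p_teacher, p_student) between the temperature-scaled
    # output distributions of teacher and student, the standard
    # knowledge-distillation objective.
    p = softmax([z / temperature for z in teacher_logits])
    q = softmax([z / temperature for z in student_logits])
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# Example (logits invented for illustration): the loss is minimized when
# the student's distribution matches the teacher's, where it reduces to
# the teacher's entropy (Gibbs' inequality).
teacher = [2.0, 0.5, -1.0]
student = [1.5, 0.2, -0.5]
loss = distillation_loss(teacher, student)
```

By Gibbs' inequality, `distillation_loss(t, t)` lower-bounds `distillation_loss(t, s)` for any student logits `s`, which is why matching the teacher's distribution minimizes the objective.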
1 code implementation • ACL 2020 • Zixia Jia, Youmi Ma, Jiong Cai, Kewei Tu
Semantic dependency parsing, which aims to find rich bi-lexical relationships, allows words to have multiple dependency heads, resulting in graph-structured representations.
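The graph-structured representation described above (a word may have multiple heads, unlike in tree-structured dependency parsing) can be sketched as a set of labeled arcs. The sentence, indices, and role labels below are invented for illustration and do not come from the paper.

```python
# A semantic dependency graph stored as labeled arcs
# (head_index, dependent_index, label); index 0 is a virtual root.
sentence = ["Mary", "wants", "to", "eat"]
arcs = {
    (0, 2, "root"),  # "wants" is the top node
    (2, 1, "ARG1"),  # "Mary" is an argument of "wants"
    (2, 4, "ARG2"),  # "eat" is an argument of "wants"
    (4, 1, "ARG1"),  # "Mary" is also an argument of "eat" -> two heads
}

def heads_of(dependent, arcs):
    # All heads of a given dependent word in the graph.
    return sorted(h for (h, d, _) in arcs if d == dependent)

# Words with more than one head make the structure a graph, not a tree.
multi_headed = [d for d in range(1, len(sentence) + 1)
                if len(heads_of(d, arcs)) > 1]
```

Here "Mary" (index 1) has two heads, which is exactly what distinguishes semantic dependency graphs from single-head dependency trees.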
1 code implementation • CoNLL 2019 • Xinyu Wang, Yixian Liu, Zixia Jia, Chengyue Jiang, Kewei Tu
This paper presents the system used in our submission to the CoNLL 2019 shared task: Cross-Framework Meaning Representation Parsing.