no code implementations • SemEval (NAACL) 2022 • Qizhi Lin, Changyu Hou, Xiaopeng Wang, Jun Wang, Yixuan Qiao, Peng Jiang, Xiandi Jiang, Benqi Wang, Qifeng Xiao
From pretrained contextual embeddings to document-level embeddings, the selection and construction of embeddings have drawn increasing attention in recent NER research.
no code implementations • 21 Oct 2022 • Jun Wang, Weixun Li, Changyu Hou, Xin Tang, Yixuan Qiao, Rui Fang, Pengyong Li, Peng Gao, Guotong Xie
Contrastive learning has emerged as a powerful tool for graph representation learning.
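A minimal sketch of the kind of contrastive objective this line of work builds on: an InfoNCE / NT-Xent loss over two augmented views of each graph, assuming the views have already been encoded into embeddings `z1` and `z2` by some GNN encoder (not shown here). All names and dimensions below are illustrative assumptions, not the paper's released code.

```python
# Sketch of a graph contrastive (InfoNCE / NT-Xent) loss on two augmented views.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """Treat (z1[i], z2[i]) as a positive pair; all other pairs in the batch are negatives."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)                 # (2n, d)
    sim = z @ z.t() / temperature                  # cosine similarities, (2n, 2n)
    sim.fill_diagonal_(float("-inf"))              # exclude self-similarity
    # The positive for row i is the other view of the same graph, i.e. row (i + n) mod 2n.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage with random embeddings standing in for GNN outputs of two views.
z1, z2 = torch.randn(8, 64), torch.randn(8, 64)
print(nt_xent_loss(z1, z2).item())
```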
no code implementations • SemEval (NAACL) 2022 • Changyu Hou, Jun Wang, Yixuan Qiao, Peng Jiang, Peng Gao, Guotong Xie, Qizhi Lin, Xiaopeng Wang, Xiandi Jiang, Benqi Wang, Qifeng Xiao
By assigning input-dependent weights to each model, we adopt a Transformer layer to effectively integrate the strengths of diverse models (see the sketch after this entry).
Tasks: Low Resource Named Entity Recognition, Named Entity Recognition, +2
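One plausible reading of the weighting scheme described above, sketched in PyTorch: a Transformer layer attends over the stacked outputs of several base NER models and produces per-input mixture weights. This is not the authors' released code; the module structure, dimensions, and names are assumptions for illustration.

```python
# Sketch: input-dependent ensemble of base models via a Transformer layer.
import torch
import torch.nn as nn

class TransformerEnsemble(nn.Module):
    def __init__(self, num_models: int, hidden_dim: int, num_labels: int):
        super().__init__()
        # One Transformer layer lets each base model's representation attend to the others.
        self.mixer = nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=4, batch_first=True)
        self.weight_head = nn.Linear(hidden_dim, 1)     # scalar score per base model
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, model_outputs: torch.Tensor) -> torch.Tensor:
        # model_outputs: (batch, num_models, hidden_dim), one feature vector per base model
        # for the same input.
        mixed = self.mixer(model_outputs)                         # contextualize across models
        weights = torch.softmax(self.weight_head(mixed), dim=1)   # (batch, num_models, 1)
        fused = (weights * model_outputs).sum(dim=1)              # input-dependent combination
        return self.classifier(fused)                             # (batch, num_labels)

# Toy usage: 3 base models, 256-dim features, 9 NER labels.
ensemble = TransformerEnsemble(num_models=3, hidden_dim=256, num_labels=9)
logits = ensemble(torch.randn(2, 3, 256))
```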