no code implementations • 19 Aug 2019 • Zhi-Xiu Ye, Qian Chen, Wen Wang, Zhen-Hua Ling
We also observe that models fine-tuned after the proposed pre-training approach maintain performance comparable to the original BERT models on other NLP tasks, such as sentence classification and natural language inference.
Ranked #26 on Common Sense Reasoning on CommonsenseQA
1 code implementation • ACL 2019 • Zhi-Xiu Ye, Zhen-Hua Ling
This paper presents a multi-level matching and aggregation network (MLMAN) for few-shot relation classification.
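As a rough illustration of the matching-and-aggregation idea behind few-shot relation classification, the following toy sketch soft-aligns a query instance against each support instance of a class and aggregates the local matching scores. The encoder, dimensions, and aggregation choices here are illustrative assumptions, not the exact MLMAN architecture from the paper.

# Toy matching-and-aggregation scorer for few-shot relation classification.
# Names and dimensions are illustrative assumptions, not the paper's MLMAN.
import torch
import torch.nn.functional as F

def encode(tokens: torch.Tensor, embedding: torch.nn.Embedding) -> torch.Tensor:
    """Map token-id sequences to embeddings (stand-in for a real sentence encoder)."""
    return embedding(tokens)  # (batch, seq_len, dim)

def match_and_aggregate(query: torch.Tensor, support: torch.Tensor) -> torch.Tensor:
    """Soft-align query tokens to support tokens, then pool to one class score.

    query:   (q_len, dim)     a single query instance
    support: (n, s_len, dim)  n support instances of one relation class
    returns: scalar matching score for that class
    """
    scores = []
    for inst in support:                         # compare the query with each support instance
        sim = query @ inst.T                     # (q_len, s_len) token-level similarity
        aligned = F.softmax(sim, dim=-1) @ inst  # soft-align support tokens to query tokens
        local = F.cosine_similarity(query, aligned, dim=-1).mean()  # local matching score
        scores.append(local)
    return torch.stack(scores).max()             # aggregate over the support instances

# Tiny usage example with random embeddings (purely illustrative).
torch.manual_seed(0)
emb = torch.nn.Embedding(100, 32)
query = encode(torch.randint(0, 100, (1, 12)), emb)[0]                     # (12, 32)
support_per_class = [encode(torch.randint(0, 100, (5, 15)), emb) for _ in range(3)]
class_scores = torch.stack([match_and_aggregate(query, s) for s in support_per_class])
print("predicted class:", class_scores.argmax().item())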
1 code implementation • NAACL 2019 • Zhi-Xiu Ye, Zhen-Hua Ling
This paper presents a neural relation extraction method to deal with the noisy training data generated by distant supervision.
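A common way to cope with distant-supervision noise is to score all sentences that mention the same entity pair as a bag and let an attention mechanism down-weight the unreliable ones. The minimal sketch below shows that generic idea; the relation-query vector and dimensions are assumptions for illustration, not the paper's exact intra-/inter-bag design.

# Attention over a bag of sentences sharing one entity pair; noisy sentences
# receive small weights. Illustrative only, not the paper's exact model.
import torch
import torch.nn.functional as F

def bag_representation(sent_reprs: torch.Tensor, relation_query: torch.Tensor) -> torch.Tensor:
    """Combine the sentence representations of one entity-pair bag.

    sent_reprs:     (num_sents, dim) encoded sentences that share an entity pair
    relation_query: (dim,)           a learned query vector for the target relation
    returns:        (dim,)           attention-weighted bag representation
    """
    logits = sent_reprs @ relation_query   # (num_sents,) relevance of each sentence
    weights = F.softmax(logits, dim=0)     # noisy sentences get small weights
    return weights @ sent_reprs            # weighted sum over the bag

torch.manual_seed(0)
bag = torch.randn(4, 64)     # 4 sentences mentioning the same entity pair
query = torch.randn(64)      # query vector for one relation label
print(bag_representation(bag, query).shape)  # torch.Size([64])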
1 code implementation • ACL 2018 • Zhi-Xiu Ye, Zhen-Hua Ling
This paper proposes hybrid semi-Markov conditional random fields (SCRFs) for neural sequence labeling in natural language processing.
Ranked #61 on Named Entity Recognition (NER) on CoNLL 2003 (English)
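To make the semi-Markov idea from the SCRF entry above concrete: unlike a token-level CRF, a semi-Markov CRF scores whole (start, end, label) segments and decodes the best segmentation by dynamic programming. The sketch below shows that segment-level decoding over toy scores; the random scores and maximum segment length are assumptions, and the hybrid combination with a token-level CRF described in the paper is not reproduced.

# Semi-Markov decoding: score labeled segments as units and pick the best
# segmentation by dynamic programming. Toy scores only, not the paper's model.
import numpy as np

def semi_markov_decode(seg_score, n, num_labels, max_len):
    """Return the best (start, end, label) segments and their total score.

    seg_score(i, j, y) -> score of labeling tokens i..j-1 as one segment with label y
    """
    best = np.full(n + 1, -np.inf)
    best[0] = 0.0
    back = [None] * (n + 1)
    for j in range(1, n + 1):
        for i in range(max(0, j - max_len), j):        # candidate segment covering tokens i..j-1
            for y in range(num_labels):
                s = best[i] + seg_score(i, j, y)
                if s > best[j]:
                    best[j], back[j] = s, (i, y)
    segments, j = [], n                                # recover segments via backpointers
    while j > 0:
        i, y = back[j]
        segments.append((i, j, y))
        j = i
    return list(reversed(segments)), best[n]

rng = np.random.default_rng(0)
scores = rng.standard_normal((8, 9, 3))                # toy segment scores for an 8-token sentence
segs, total = semi_markov_decode(lambda i, j, y: scores[i, j, y], n=8, num_labels=3, max_len=4)
print(segs, total)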