Search Results for author: Zerui Cai

Found 1 paper, 1 paper with code

SMedBERT: A Knowledge-Enhanced Pre-trained Language Model with Structured Semantics for Medical Text Mining

2 code implementations · ACL 2021
Taolin Zhang, Zerui Cai, Chengyu Wang, Minghui Qiu, Bite Yang, Xiaofeng He

Recently, the performance of Pre-trained Language Models (PLMs) has been significantly improved by injecting knowledge facts to enhance their language understanding abilities.

Tasks: Language Modelling, Natural Language Inference, +1
