ERNIE: Enhanced Language Representation with Informative Entities

ACL 2019
Zhengyan Zhang, Xu Han, Zhiyuan Liu, Xin Jiang, Maosong Sun, Qun Liu

Neural language representation models such as BERT pre-trained on large-scale corpora can well capture rich semantic patterns from plain text, and be fine-tuned to consistently improve the performance of various NLP tasks. However, the existing pre-trained language models rarely consider incorporating knowledge graphs (KGs), which can provide rich structured knowledge facts for better language understanding...
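The core idea the abstract describes, injecting aligned KG entity embeddings into a BERT-style encoder's token representations, can be illustrated with a minimal PyTorch sketch. This is not the released thunlp/ERNIE implementation; the module name TokenEntityFusion, the dimensions, and the masking scheme are illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of token-entity fusion:
# token hidden states are combined with aligned KG entity embeddings
# (e.g. TransE vectors) and projected back into two parallel streams.
import torch
import torch.nn as nn

class TokenEntityFusion(nn.Module):
    def __init__(self, token_dim=768, entity_dim=100):
        super().__init__()
        self.token_proj = nn.Linear(token_dim, token_dim)
        self.entity_proj = nn.Linear(entity_dim, token_dim)
        self.token_out = nn.Linear(token_dim, token_dim)
        self.entity_out = nn.Linear(token_dim, entity_dim)
        self.act = nn.GELU()

    def forward(self, token_states, entity_states, entity_mask):
        # entity_mask marks tokens aligned to a KG entity (e.g. the
        # first wordpiece of a mention); unaligned tokens fuse zeros.
        entity_states = entity_states * entity_mask.unsqueeze(-1)
        fused = self.act(self.token_proj(token_states)
                         + self.entity_proj(entity_states))
        # Split the fused state back into token and entity spaces.
        return self.act(self.token_out(fused)), self.act(self.entity_out(fused))

tokens = torch.randn(2, 16, 768)    # batch of token hidden states
entities = torch.randn(2, 16, 100)  # aligned entity embeddings (zero-padded)
mask = torch.zeros(2, 16)
mask[:, 3] = 1.0                    # one entity mention per sequence
new_tokens, new_entities = TokenEntityFusion()(tokens, entities, mask)
print(new_tokens.shape, new_entities.shape)  # (2, 16, 768) (2, 16, 100)
```

Keeping two output streams rather than a single merged vector reflects the paper's framing of lexical/syntactic and knowledge information as complementary signals that are fused and then separated again at each layer.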


Evaluation Results from the Paper


TASK                         DATASET                       MODEL   METRIC NAME   METRIC VALUE   GLOBAL RANK
Relation Extraction          FewRel                        ERNIE   F1            88.32          #1
Natural Language Inference   MultiNLI                      ERNIE   Matched       84.0           #12
Natural Language Inference   MultiNLI                      ERNIE   Mismatched    83.2           #11
Entity Typing                Open Entity                   ERNIE   F1            75.56          #1
Sentiment Analysis           SST-2 Binary classification   ERNIE   Accuracy      93.5           #13
Relation Extraction          TACRED                        ERNIE   F1            67.97          #4
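For the relation-extraction rows above, F1 is conventionally micro-averaged over all relation classes except the negative class (no_relation on TACRED). A minimal sketch of that computation, assuming this convention; the label name and data layout are illustrative:

```python
# Micro-averaged precision/recall/F1, ignoring the negative class,
# as is standard for TACRED-style relation extraction evaluation.
def micro_f1(gold, pred, negative_label="no_relation"):
    correct = guessed = actual = 0
    for g, p in zip(gold, pred):
        if p != negative_label:
            guessed += 1          # model predicted a relation
            if p == g:
                correct += 1      # ... and it was the right one
        if g != negative_label:
            actual += 1           # gold relation exists
    precision = correct / guessed if guessed else 0.0
    recall = correct / actual if actual else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(micro_f1(["born_in", "no_relation", "ceo_of"],
               ["born_in", "no_relation", "no_relation"]))  # ~0.667
```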