Semantics-aware BERT for Language Understanding

5 Sep 2019 · Zhuosheng Zhang, Yuwei Wu, Hai Zhao, Zuchao Li, Shuailiang Zhang, Xi Zhou, Xiang Zhou

The latest work on language representations carefully integrates contextualized features into language model training, which has enabled a series of successes, especially on various machine reading comprehension and natural language inference tasks. However, the existing language representation models, including ELMo, GPT and BERT, only exploit plain context-sensitive features such as character or word embeddings...
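The model evaluated below, SemBERT, augments BERT's contextual token representations with explicit semantic cues produced by a pre-trained semantic role labeler. The following is a minimal sketch, not the authors' implementation, of one way such a fusion can be wired up: embed per-token SRL tag ids and concatenate them with the encoder's hidden states before a projection. The hidden sizes, the tag-vocabulary size, and the concatenation-plus-projection fusion layer are illustrative assumptions.

```python
# Illustrative sketch of fusing BERT-style token states with SRL tag embeddings.
# Dimensions and the fusion design are assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn


class SemanticsAwareFusion(nn.Module):
    """Concatenate per-token encoder states with SRL tag embeddings and project back."""

    def __init__(self, hidden_size=768, num_srl_tags=20, tag_dim=64):
        super().__init__()
        self.tag_embedding = nn.Embedding(num_srl_tags, tag_dim)
        self.project = nn.Linear(hidden_size + tag_dim, hidden_size)

    def forward(self, token_states, srl_tag_ids):
        # token_states: (batch, seq_len, hidden_size) from a BERT-style encoder
        # srl_tag_ids:  (batch, seq_len) integer ids predicted by an external SRL tagger
        tag_vectors = self.tag_embedding(srl_tag_ids)
        fused = torch.cat([token_states, tag_vectors], dim=-1)
        return torch.tanh(self.project(fused))


# Toy usage with random tensors standing in for real encoder output and SRL tags.
batch, seq_len, hidden = 2, 8, 768
fusion = SemanticsAwareFusion(hidden_size=hidden)
states = torch.randn(batch, seq_len, hidden)
tags = torch.randint(0, 20, (batch, seq_len))
print(fusion(states, tags).shape)  # torch.Size([2, 8, 768])
```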


Evaluation Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Natural Language Inference | SNLI | SemBERT | Test Accuracy (%) | 91.9 | #1 |
| Natural Language Inference | SNLI | SemBERT | Train Accuracy (%) | 94.4 | #13 |
| Natural Language Inference | SNLI | SemBERT | Parameters | 339M | #1 |
| Question Answering | SQuAD2.0 | SemBERT (ensemble) | EM | 86.166 | #42 |
| Question Answering | SQuAD2.0 | SemBERT (ensemble) | F1 | 88.886 | #47 |
| Question Answering | SQuAD2.0 | SemBERT (single model) | EM | 84.800 | #59 |
| Question Answering | SQuAD2.0 | SemBERT (single model) | F1 | 87.864 | #59 |
| Question Answering | SQuAD2.0 dev | SemBERT large | F1 | 83.6 | #7 |
| Question Answering | SQuAD2.0 dev | SemBERT large | EM | 80.9 | #5 |
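For reference, the EM (exact match) and F1 figures in the SQuAD2.0 rows above follow the standard SQuAD evaluation convention: predicted and gold answer strings are normalized (lowercased, with punctuation and articles stripped) before comparison. The sketch below is a simplified version of those two metrics; the official evaluation script additionally handles multiple reference answers and unanswerable questions.

```python
# Simplified SQuAD-style EM and token-level F1; normalization is reduced for illustration.
import re
import string
from collections import Counter


def normalize(text):
    """Lowercase, strip punctuation and articles, and collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())


def exact_match(prediction, reference):
    return float(normalize(prediction) == normalize(reference))


def f1_score(prediction, reference):
    pred_tokens = normalize(prediction).split()
    ref_tokens = normalize(reference).split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)


print(exact_match("the Eiffel Tower", "Eiffel Tower"))            # 1.0
print(round(f1_score("Eiffel Tower in Paris", "Eiffel Tower"), 3))  # 0.667
```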