BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

NAACL 2019 · Code: PaddlePaddle/PaddleNLP

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.

Tasks: Citation Intent Classification, Common Sense Reasoning, and 10 more