We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.
COLING (TextGraphs) 2020
To address this problem, we use a pre-trained language model to recall the top-K relevant explanations for each question.
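The retrieval step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `embed` function below is a stand-in bag-of-words encoder, whereas the paper uses a pre-trained language model to score each question against the explanation pool before keeping the top-K.

```python
import math
from collections import Counter

def embed(text):
    # Stand-in encoder: a bag-of-words count vector. In the actual
    # system this would be a sentence representation produced by a
    # pre-trained language model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recall_top_k(question, explanations, k):
    # Score every candidate explanation against the question and
    # keep the K highest-scoring ones.
    q = embed(question)
    ranked = sorted(explanations, key=lambda e: cosine(q, embed(e)), reverse=True)
    return ranked[:k]

explanations = [
    "friction produces heat",
    "a magnet attracts iron",
    "rubbing two objects together causes friction",
]
top = recall_top_k("why does rubbing hands produce heat", explanations, 2)
print(top)
```

Swapping `embed` for real language-model embeddings leaves the top-K selection logic unchanged; only the scoring representation improves.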