CoNLL (EMNLP) 2021 • Yiyun Zhao, Jian Gang Ngui, Lucy Hall Hartley, Steven Bethard
Pretrained transformer-based language models achieve state-of-the-art performance on many NLP tasks, but it remains an open question whether the knowledge these models acquire during pretraining resembles human linguistic knowledge.
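As a minimal sketch of the kind of probing this question invites (not the paper's own method), one can query a pretrained masked language model and inspect its top completions; the example below assumes the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint.

```python
# Illustrative probe of a pretrained model's linguistic knowledge via a
# fill-mask query. This is a generic sketch, not the evaluation used in
# the paper; it assumes `transformers` and `bert-base-uncased`.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model's ranked completions hint at which linguistic regularities
# it absorbed during pretraining.
for prediction in fill_mask("The chef [MASK] the meal in an hour."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```

Comparing such model predictions against human judgments on the same sentences is one common way to ask whether pretraining yields human-like linguistic knowledge.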