Search Results for author: Jian Gang Ngui

Found 1 paper, 0 papers with code

Do pretrained transformers infer telicity like humans?

No code implementations. CoNLL (EMNLP) 2021. Yiyun Zhao, Jian Gang Ngui, Lucy Hall Hartley, Steven Bethard

Pretrained transformer-based language models achieve state-of-the-art performance in many NLP tasks, but it is an open question whether the knowledge acquired by the models during pretraining resembles the linguistic knowledge of humans.
