Search Results for author: Robert Tinn

Found 6 papers, 2 papers with code

Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing

1 code implementation • 31 Jul 2020 • Yu Gu, Robert Tinn, Hao Cheng, Michael Lucas, Naoto Usuyama, Xiaodong Liu, Tristan Naumann, Jianfeng Gao, Hoifung Poon

In this paper, we challenge the prevailing assumption that domain-specific pretraining benefits from starting with general-domain language models, by showing that for domains with abundant unlabeled text, such as biomedicine, pretraining language models from scratch results in substantial gains over continual pretraining of general-domain language models.
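
To make the contrast concrete, here is a minimal sketch of the two strategies the abstract compares, written against the Hugging Face transformers API (an assumption; the paper's own training stack is not reproduced here). The from-scratch tokenizer reuses the PubMedBERT checkpoint released with this paper as a stand-in for training a new in-domain vocabulary.

```python
# Sketch of the two pretraining setups compared in the paper, using
# Hugging Face transformers (an assumed toolkit, not the authors' code).
from transformers import AutoTokenizer, BertConfig, BertForMaskedLM

# Continual pretraining: start from general-domain weights and vocabulary,
# then continue masked-language-model training on biomedical text.
continual_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
continual_model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Domain-specific pretraining from scratch: randomly initialized weights
# plus an in-domain vocabulary (here, the released PubMedBERT tokenizer
# stands in for a tokenizer trained directly on PubMed text).
scratch_tok = AutoTokenizer.from_pretrained(
    "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"
)
scratch_model = BertForMaskedLM(BertConfig(vocab_size=scratch_tok.vocab_size))
```

Either model would then be trained with the same masked-language-modeling objective on biomedical text; the difference lies entirely in the initialization and vocabulary.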

Tasks: Continual Pretraining (+11 more)
