Findings (EMNLP) 2021 • Rasmus Kær Jørgensen, Mareike Hartmann, Xiang Dai, Desmond Elliott
Domain-adaptive pretraining, i.e., the continued unsupervised pretraining of a language model on domain-specific text, improves the modelling of text for downstream tasks within the domain.
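As a rough illustration (not the paper's code), continued unsupervised pretraining typically reuses the model's original objective, such as BERT-style masked language modelling, on in-domain text. The sketch below shows only the masking step of that objective, with placeholder token ids; `MASK_ID` and the example ids are hypothetical.

```python
import random

MASK_ID = 0  # hypothetical id for the [MASK] token


def mask_tokens(token_ids, mask_prob=0.15, rng=None):
    """Randomly replace tokens with MASK_ID, returning (masked, labels).

    Labels are -100 (ignored by the loss) except at masked positions,
    where they hold the original token id, as in BERT-style MLM.
    """
    rng = rng or random.Random(0)
    masked, labels = [], []
    for tid in token_ids:
        if rng.random() < mask_prob:
            masked.append(MASK_ID)
            labels.append(tid)    # model must reconstruct this token
        else:
            masked.append(tid)
            labels.append(-100)   # position not scored
    return masked, labels


# Domain-adaptive pretraining would apply this masking to domain-specific
# text and minimise cross-entropy at the masked positions.
domain_ids = [101, 7592, 2088, 102]  # placeholder ids for an in-domain sentence
masked, labels = mask_tokens(domain_ids, mask_prob=0.5)
```

A higher `mask_prob` is used here only so that the short example masks at least one position; 0.15 is the conventional default.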