Search Results for author: Rasmus Kær Jørgensen

Found 1 paper, 1 paper with code

MDAPT: Multilingual Domain Adaptive Pretraining in a Single Model

1 code implementation · Findings (EMNLP) 2021 · Rasmus Kær Jørgensen, Mareike Hartmann, Xiang Dai, Desmond Elliott

Domain adaptive pretraining, i.e. the continued unsupervised pretraining of a language model on domain-specific text, improves the modelling of text for downstream tasks within the domain.
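Continued pretraining of a BERT-style model typically reuses its original self-supervised objective, masked language modelling, on the in-domain text. A minimal sketch of the token-masking step, assuming the standard 15% masking rate with the usual 80/10/10 mask/random/keep split (the function name and toy vocabulary are illustrative, not from the paper):

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "dog", "run", "jump"]  # toy vocabulary for random replacement

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Select ~mask_prob of tokens as prediction targets.

    Of the selected tokens: 80% become [MASK], 10% become a random
    vocabulary token, 10% are kept unchanged. The model is trained to
    recover the original token at every selected position.
    """
    rng = random.Random(seed)
    inputs = list(tokens)
    labels = [None] * len(tokens)  # None = position not used in the loss
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the original token is the prediction target
            r = rng.random()
            if r < 0.8:
                inputs[i] = MASK
            elif r < 0.9:
                inputs[i] = rng.choice(VOCAB)
            # else: keep the token unchanged
    return inputs, labels
```

During domain adaptive pretraining, batches of domain-specific text are masked this way and the model minimises the cross-entropy of recovering the original tokens at the selected positions.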

Tasks: Language Modelling, Named Entity Recognition, +4
