5 Dec 2019 • Xing Meng, Craig H. Ganoe, Ryan T. Sieberg, Yvonne Y. Cheung, Saeed Hassanpour
We pre-trained the BERT model on a large unlabeled corpus of radiology reports and used the resulting contextual representations in a final text classifier for communication urgency.
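The pipeline described above has two stages: domain-specific pre-training of BERT, then a downstream classifier over the resulting contextual representations. A minimal sketch of the second stage is below; since the paper's corpus and model are not available here, the pooled BERT embeddings are replaced with placeholder 768-dimensional vectors (the hidden size of BERT-base), and the classifier choice (logistic regression) is an assumption, not the authors' exact head.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
DIM = 768  # hidden size of BERT-base; stands in for the pooled [CLS] vector


def fake_bert_embedding(urgent: bool, rng, dim: int = DIM) -> np.ndarray:
    """Placeholder for a pooled BERT embedding of one radiology report.

    A real pipeline would encode the report text with the domain
    pre-trained BERT model; here urgent vs. non-urgent reports are
    simulated as Gaussian clusters with different means.
    """
    center = 1.0 if urgent else -1.0
    return rng.normal(loc=center, scale=0.5, size=dim)


# Simulated training set: alternating urgent / non-urgent reports.
X_train = np.stack([fake_bert_embedding(i % 2 == 0, rng) for i in range(40)])
y_train = np.array([int(i % 2 == 0) for i in range(40)])

# Final text classifier over the contextual representations.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Classify two held-out "reports": one urgent, one not.
X_test = np.stack([fake_bert_embedding(True, rng), fake_bert_embedding(False, rng)])
preds = clf.predict(X_test)  # → array([1, 0]) on these separable clusters
```

In practice the embedding step would be swapped for the pre-trained model's output, and the head could equally be a fine-tuned classification layer on top of BERT itself rather than a separate classifier.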