Search Results for author: Yongyu Yan

Found 1 paper, 1 paper with code

AF Adapter: Continual Pretraining for Building Chinese Biomedical Language Model

1 code implementation · 21 Nov 2022 · Yongyu Yan, Kui Xue, Xiaoming Shi, Qi Ye, Jingping Liu, Tong Ruan

Continual pretraining is a popular way of building a domain-specific pretrained language model from a general-domain language model.
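The idea can be illustrated with a toy sketch (not the paper's method, which continually pretrains a neural language model): a character-level bigram model is first fit on general-domain text, then training continues on domain text, and the model's fit on the domain text improves. All corpora and names here are made up for illustration.

```python
import math
from collections import defaultdict

def train_bigram(counts, corpus):
    # Accumulate character-bigram counts, continuing from any prior counts
    # (the "continual" part: training resumes instead of starting fresh).
    for text in corpus:
        for a, b in zip(text, text[1:]):
            counts[a][b] += 1
    return counts

def avg_nll(counts, text, vocab_size=128, alpha=1.0):
    # Average negative log-likelihood of `text` under the
    # add-alpha-smoothed bigram model; lower means a better fit.
    nll, n = 0.0, 0
    for a, b in zip(text, text[1:]):
        total = sum(counts[a].values())
        p = (counts[a][b] + alpha) / (total + alpha * vocab_size)
        nll -= math.log(p)
        n += 1
    return nll / max(n, 1)

# Hypothetical corpora standing in for general-domain and biomedical text.
general = ["the cat sat on the mat", "a dog ran in the park"]
biomedical = ["the patient presented with acute myocardial infarction"]

counts = defaultdict(lambda: defaultdict(int))
train_bigram(counts, general)        # general-domain "pretraining"
before = avg_nll(counts, biomedical[0])

train_bigram(counts, biomedical)     # continual pretraining on domain text
after = avg_nll(counts, biomedical[0])

print(after < before)
```

The same shape applies at full scale: a general-domain checkpoint is loaded and the pretraining objective simply continues on an in-domain corpus, rather than training from random initialization.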

Continual Pretraining · Language Modelling
