Search Results for author: Wen Tai

Found 1 paper, 0 papers with code

exBERT: Extending Pre-trained Models with Domain-specific Vocabulary Under Constrained Training Resources

no code implementations • Findings of the Association for Computational Linguistics 2020 • Wen Tai, H. T. Kung, Xin Dong, Marcus Comiter, Chang-Fu Kuo

We introduce exBERT, a training method for extending a BERT model pre-trained on a general domain into a new pre-trained model for a specific domain, using a new additive vocabulary, under constrained training resources (i.e., limited computation and data).
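The sketch below illustrates the additive-vocabulary idea at its simplest, using the Hugging Face transformers library. It is not the authors' exBERT method (which trains a separate extension module alongside the frozen original model); the domain token list is hypothetical, and only the new embedding rows would be trained from scratch.

```python
# Minimal sketch: adding domain-specific tokens to a general-domain BERT
# vocabulary. Assumes the `transformers` package is installed. This only
# demonstrates vocabulary extension, not the exBERT extension module.
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical domain-specific terms (e.g., clinical vocabulary).
domain_tokens = ["nephropathy", "hematuria", "creatinine"]
num_added = tokenizer.add_tokens(domain_tokens)

# Grow the embedding matrix so the new tokens get trainable vectors;
# the existing pre-trained embeddings are preserved unchanged.
model.resize_token_embeddings(len(tokenizer))
print(f"Added {num_added} tokens; vocabulary size is now {len(tokenizer)}")
```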
