14 Jul 2023 • Qian Chen, Wen Wang, Qinglin Zhang, Chong Deng, Yukun Ma, Siqi Zheng
Transformer-based pre-trained language models, such as BERT, achieve great success in various natural language understanding tasks.
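BERT's pre-training objective is masked language modeling: a fraction of input tokens (15% in the original BERT recipe) is replaced with a mask symbol, and the model is trained to recover the originals. The function below is a minimal sketch of that masking step on a token list; the function name, the `[MASK]` string, and the toy sentence are illustrative, not from any particular library.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """Randomly hide tokens for BERT-style masked language modeling.

    Returns the masked sequence plus per-position labels: the original
    token where a position was masked (the prediction target), or None
    where the position is excluded from the loss.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(mask_token)
            labels.append(tok)    # model must predict this token
        else:
            masked.append(tok)
            labels.append(None)   # unmasked positions carry no target
    return masked, labels

tokens = "the cat sat on the mat".split()
masked, labels = mask_tokens(tokens, seed=42)
```

In full BERT pre-training the masked positions are further split (80% `[MASK]`, 10% random token, 10% unchanged), which this sketch omits for brevity.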