Search Results for author: Ma Yukun

Found 1 paper, 0 papers with code

Improving BERT with Hybrid Pooling Network and Drop Mask

no code implementations · 14 Jul 2023 · Qian Chen, Wen Wang, Qinglin Zhang, Chong Deng, Ma Yukun, Siqi Zheng

Transformer-based pre-trained language models, such as BERT, achieve great success in various natural language understanding tasks.

Tasks: Language Modelling, Masked Language Modeling, +2
