Search Results for author: Fengcheng Yuan

Found 1 paper, 1 paper with code

ROSITA: Refined BERT cOmpreSsion with InTegrAted techniques

1 code implementation • 21 Mar 2021 • Yuanxin Liu, Zheng Lin, Fengcheng Yuan

Based on the empirical findings, our best compressed model, dubbed Refined BERT cOmpreSsion with InTegrAted techniques (ROSITA), is $7.5\times$ smaller than BERT while maintaining $98.5\%$ of the performance on five tasks of the GLUE benchmark, outperforming previous BERT compression methods with a similar parameter budget.

Knowledge Distillation
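
As a quick illustration of the tagged task, the sketch below shows a standard knowledge-distillation loss that combines a temperature-softened teacher/student KL term with ordinary cross-entropy. This is a generic sketch, not ROSITA's exact training objective; the function name and the `temperature` and `alpha` weighting are assumptions for illustration.

```python
# Generic knowledge-distillation loss sketch (not ROSITA's specific objective).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher -> student) with hard-label cross-entropy."""
    # Softened log-distributions for teacher and student.
    soft_teacher = F.log_softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence on the softened distributions, scaled by T^2 as is conventional.
    kd = F.kl_div(soft_student, soft_teacher, log_target=True,
                  reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```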
