no code implementations • 28 Jan 2020 • Shoubin Li, Wenzao Cui, Yujiang Liu, Xuran Ming, Jun Hu, Yuanzhe Hu, Qing Wang
Pre-trained models such as BERT are widely used in NLP and are routinely fine-tuned to consistently improve performance across downstream tasks.