UER: An Open-Source Toolkit for Pre-training Models

IJCNLP 2019. Zhe Zhao, Hui Chen, Jinbin Zhang, Xin Zhao, Tao Liu, Wei Lu, Xi Chen, Haotang Deng, Qi Ju, Xiaoyong Du

Existing works, including ELMo and BERT, have revealed the importance of pre-training for NLP tasks. Since no single pre-training model works best in all cases, it is necessary to develop a framework that can deploy various pre-training models efficiently...
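A framework that deploys various pre-training models typically does so by making the embedding, encoder, and training target interchangeable modules. The sketch below illustrates that composition pattern in plain Python; it is only a toy illustration of the idea, and every class name here (`WordEmbedding`, `LstmEncoder`, `MlmTarget`, `Model`) is hypothetical, not UER's actual API.

```python
# Toy sketch of a modular pre-training pipeline: swap any component
# to obtain a different pre-training model. All names are hypothetical.

class WordEmbedding:
    def __call__(self, tokens):
        # Map each token id to a (toy) one-dimensional vector.
        return [[float(t)] for t in tokens]

class LstmEncoder:
    def __call__(self, vectors):
        # Stand-in for a real sequence encoder: a running sum.
        out, acc = [], 0.0
        for v in vectors:
            acc += v[0]
            out.append([acc])
        return out

class MlmTarget:
    def __call__(self, hidden):
        # Stand-in for a masked-LM head: collapse to a scalar "loss".
        return sum(h[0] for h in hidden)

class Model:
    """Compose interchangeable embedding / encoder / target modules."""
    def __init__(self, embedding, encoder, target):
        self.embedding, self.encoder, self.target = embedding, encoder, target

    def forward(self, tokens):
        return self.target(self.encoder(self.embedding(tokens)))

model = Model(WordEmbedding(), LstmEncoder(), MlmTarget())
loss = model.forward([1, 2, 3])
```

Replacing `LstmEncoder` with a Transformer-style encoder, or `MlmTarget` with a different training objective, would yield a different pre-training model without touching the rest of the pipeline, which is the kind of reuse such a framework aims for.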

