Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling

EMNLP 2018 · Liyuan Liu, Xiang Ren, Jingbo Shang, Jian Peng, Jiawei Han

Many efforts have been made to facilitate natural language processing tasks with pre-trained language models (LMs), bringing significant improvements to various applications. To fully leverage the nearly unlimited corpora and capture linguistic information at multiple levels, large LMs are required; but for a specific task, only part of this information is useful...
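The observation that only part of a large LM is useful for a given task motivates pruning. As an illustrative sketch (not the paper's exact method), one common formulation learns a per-layer importance weight over the LM's layer representations and drops layers whose weight falls below a threshold, so the pruned model only needs to compute the surviving layers. All names and values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy contextualized representations: 4 LM layers, 6 tokens, hidden dim 8.
num_layers, seq_len, dim = 4, 6, 8
layer_reps = rng.normal(size=(num_layers, seq_len, dim))

# Hypothetical learned per-layer importance logits, softmax-normalized.
logits = np.array([2.0, -1.0, 0.1, -3.0])
weights = np.exp(logits) / np.exp(logits).sum()

# Prune layers whose weight falls below a threshold, then renormalize
# the weights of the surviving layers.
threshold = 0.05
keep = weights >= threshold
pruned_weights = np.where(keep, weights, 0.0)
pruned_weights /= pruned_weights.sum()

# Task-specific representation: weighted sum over the surviving layers.
rep = np.tensordot(pruned_weights, layer_reps, axes=(0, 0))
print("kept layers:", np.flatnonzero(keep).tolist())
print("representation shape:", rep.shape)
```

After pruning, layers with zero weight can simply be removed from the forward pass, which is where the efficiency gain comes from.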


Evaluation results from the paper


Task: Named Entity Recognition (NER)
Dataset: CoNLL 2003 (English)
Model: LD-Net
Metric: F1 = 92.03
Global rank: #8