Semi-supervised sequence tagging with bidirectional language models

ACL 2017 · Matthew E. Peters, Waleed Ammar, Chandra Bhagavatula, Russell Power

Pre-trained word embeddings learned from unlabeled text have become a standard component of neural network architectures for NLP tasks. However, in most cases, the recurrent network that operates on word-level representations to produce context-sensitive representations is trained on relatively little labeled data...
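The core idea of TagLM is to augment each token's representation in the sequence tagger with a context embedding produced by a pretrained bidirectional language model. The sketch below illustrates just the concatenation step with NumPy; the dimensions and function name are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical dimensions for illustration (not taken from the paper).
TOKEN_DIM = 4   # size of the tagger's word-level token representation
LM_DIM = 6      # size of the pretrained bidirectional-LM context embedding

def augment_with_lm(token_reprs, lm_embeddings):
    """Concatenate each token's representation with its pretrained
    bidirectional-LM embedding before feeding it to the tagging RNN."""
    assert token_reprs.shape[0] == lm_embeddings.shape[0]
    return np.concatenate([token_reprs, lm_embeddings], axis=-1)

sentence_len = 3
tokens = np.random.randn(sentence_len, TOKEN_DIM)   # from the task model
lm = np.random.randn(sentence_len, LM_DIM)          # from the frozen LM
augmented = augment_with_lm(tokens, lm)
print(augmented.shape)  # (3, 10)
```

In the paper's setup the LM is trained once on unlabeled text and kept fixed, so the only supervised parameters are those of the tagger that consumes these augmented inputs.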


Evaluation results from the paper


| Task | Dataset | Model | Metric name | Metric value | Global rank |
|------|---------|-------|-------------|--------------|-------------|
| Named Entity Recognition (NER) | CoNLL 2003 (English) | TagLM | F1 | 91.93 | #9 |