Contextual String Embeddings for Sequence Labeling

COLING 2018 | Alan Akbik, Duncan Blythe, Roland Vollgraf

Recent advances in language modeling using recurrent neural networks have made it viable to model language as distributions over characters. By learning to predict the next character on the basis of previous characters, such models have been shown to automatically internalize linguistic concepts such as words, sentences, subclauses and even sentiment...
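The next-character prediction objective described above can be illustrated with a toy count-based character model. This is a sketch only: the paper itself uses character-level LSTM language models, and the function names here are hypothetical.

```python
from collections import Counter, defaultdict

def train_char_lm(text, order=3):
    """Count next-character frequencies for each context of `order` characters."""
    model = defaultdict(Counter)
    padded = "~" * order + text  # pad so every position has a full-length context
    for i in range(order, len(padded)):
        context, char = padded[i - order:i], padded[i]
        model[context][char] += 1
    return model

def predict_next(model, context):
    """Return the most likely next character after `context`, or None if unseen."""
    counts = model.get(context)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

corpus = "the cat sat on the mat. the cat ate the rat."
lm = train_char_lm(corpus, order=3)
print(repr(predict_next(lm, "the")))  # → ' '
```

A neural character LM replaces the count table with a recurrent network whose hidden state after consuming a word's characters serves as that word's contextual embedding, which is what the Flair embeddings in the results below are built from.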


Evaluation results from the paper


Task                     | Dataset                     | Model            | Metric   | Value | Global rank
Named Entity Recognition | CoNLL 2003 (English)        | Flair embeddings | F1       | 93.09 | #5
Named Entity Recognition | Long-tail emerging entities | Flair embeddings | F1       | 50.20 | #1
Named Entity Recognition | OntoNotes v5 (English)      | Flair embeddings | F1       | 89.3  | #3
Part-of-Speech Tagging   | Penn Treebank               | Flair embeddings | Accuracy | 97.85 | #2
Chunking                 | Penn Treebank               | Flair embeddings | F1       | 96.72 | #1