Contextual String Embeddings for Sequence Labeling

Recent advances in language modeling using recurrent neural networks have made it viable to model language as distributions over characters. By learning to predict the next character on the basis of previous characters, such models have been shown to automatically internalize linguistic concepts such as words, sentences, subclauses and even sentiment...
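The abstract describes training objective of a character-level language model: predict the next character from the preceding ones. As a hedged illustration only, the toy below uses a count-based bigram model rather than the recurrent network the paper actually trains; the corpus string and function names are invented for this sketch.

```python
from collections import defaultdict, Counter

# Toy character-level language modeling: a bigram model that predicts
# the next character from the single previous character. The paper's
# model is a recurrent network conditioning on long histories; this
# count-based version only illustrates the next-character objective.
def train_char_bigram(corpus):
    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev_char):
    # Return the most frequent character observed after prev_char.
    if prev_char not in counts:
        return None
    return counts[prev_char].most_common(1)[0][0]

model = train_char_bigram("the quick brown fox jumps over the lazy dog; the end")
print(predict_next(model, "t"))  # 'h' (every 't' in this corpus is followed by 'h')
```

A neural version would replace the count table with a hidden state updated character by character; the paper derives its contextual embeddings from those hidden states.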

Published at COLING 2018 (PDF and abstract available).
| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Named Entity Recognition | CoNLL++ | Flair embeddings | F1 | 93.89 | #3 |
| Chunking | CoNLL 2000 | Flair | Exact Span F1 | 96.72 | #4 |
| Named Entity Recognition | CoNLL 2003 (English) | Flair embeddings | F1 | 93.09 | #14 |
| Named Entity Recognition | CoNLL 2003 (German) Revised | Flair | F1 | 88.3 | #4 |
| Named Entity Recognition | Long-tail emerging entities | Flair embeddings | F1 | 50.20 | #1 |
| Named Entity Recognition | Ontonotes v5 (English) | Flair embeddings | F1 | 89.3 | #8 |
| Part-Of-Speech Tagging | Penn Treebank | Flair embeddings | Accuracy | 97.85 | #2 |
| Chunking | Penn Treebank | Flair embeddings | F1 score | 96.72 | #2 |
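Most of the figures above are span-level F1 scores. As a minimal sketch (the exact evaluation scripts for each benchmark may differ, and the spans below are made up for illustration), a prediction counts as correct only if its boundaries and label both match a gold span, and F1 is the harmonic mean of precision and recall:

```python
# Span-level F1: compare predictions and gold annotations as exact
# (start, end, label) triples; partial overlaps score nothing.
def span_f1(gold, pred):
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)          # exact matches
    if tp == 0:
        return 0.0
    precision = tp / len(pred)     # fraction of predictions that are correct
    recall = tp / len(gold)        # fraction of gold spans that were found
    return 2 * precision * recall / (precision + recall)

gold = {(0, 2, "PER"), (5, 6, "LOC"), (8, 9, "ORG")}
pred = {(0, 2, "PER"), (5, 6, "ORG")}  # second span has the wrong label
print(round(span_f1(gold, pred), 2))   # 0.4: precision 1/2, recall 1/3
```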