Chunking, also known as shallow parsing, identifies contiguous spans of tokens that form syntactic units such as noun phrases or verb phrases.
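Chunk spans are commonly encoded as per-token BIO tags (B- begins a chunk, I- continues it, O marks tokens outside any chunk). A minimal sketch of decoding such tags back into spans; the sentence and tags below are made-up illustrations, not taken from any dataset on this page:

```python
def bio_to_chunks(tokens, tags):
    """Return (chunk_type, start, end) spans from BIO tags (end exclusive)."""
    chunks, start, ctype = [], None, None
    for i, tag in enumerate(tags):
        # A B- tag, an O tag, or an I- tag of a different type closes
        # any chunk currently open.
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and ctype != tag[2:]
        ):
            if start is not None:
                chunks.append((ctype, start, i))
                start, ctype = None, None
        if tag.startswith("B-"):
            start, ctype = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            # Tolerate an I- tag with no preceding B- by opening a chunk.
            start, ctype = i, tag[2:]
    if start is not None:
        chunks.append((ctype, start, len(tags)))
    return chunks

tokens = ["He", "reckons", "the", "current", "account", "deficit"]
tags = ["B-NP", "B-VP", "B-NP", "I-NP", "I-NP", "I-NP"]
print(bio_to_chunks(tokens, tags))
# [('NP', 0, 1), ('VP', 1, 2), ('NP', 2, 6)]
```

Chunking benchmarks typically score the predicted spans, not the raw tags, so a decoder like this sits between the tagger and the evaluation.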
Recent advances in language modeling using recurrent neural networks have made it viable to model language as distributions over characters.
SOTA for Chunking on Penn Treebank
We investigate the design challenges of constructing effective and efficient neural sequence labeling systems by reproducing twelve neural sequence labeling models, which cover most state-of-the-art structures, and conducting a systematic model comparison on three benchmarks (i.e., NER, chunking, and POS tagging).
Selecting optimal parameters for a neural network architecture can often make the difference between mediocre and state-of-the-art performance.
This paper proposes hybrid semi-Markov conditional random fields (SCRFs) for neural sequence labeling in natural language processing.
#23 best model for Named Entity Recognition on CoNLL 2003 (English)
We propose a sequence labeling framework with a secondary training objective, learning to predict surrounding words for every word in the dataset.
#3 best model for Grammatical Error Detection on FCE
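The secondary objective above adds language-modeling losses for predicting the surrounding words to the main tagging loss. A minimal sketch of that weighting, assuming forward and backward LM losses combined with a coefficient gamma; the function name and the weights used here are illustrative, not values from the paper:

```python
def joint_loss(tagging_loss, lm_loss_fwd, lm_loss_bwd, gamma=0.1):
    """Total loss: main tagging loss plus down-weighted bidirectional
    language-modeling losses (gamma is a tunable hyperparameter)."""
    return tagging_loss + gamma * (lm_loss_fwd + lm_loss_bwd)

# With gamma = 0.5: 1.0 + 0.5 * (2.0 + 3.0) = 3.5
print(joint_loss(1.0, 2.0, 3.0, gamma=0.5))
```

Keeping gamma small lets the auxiliary objective regularize the shared representation without overwhelming the labeling task.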
It can also use sentence-level tag information thanks to a CRF layer.
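A CRF layer scores whole tag sequences by adding tag-transition scores to the per-token emission scores, so decoding picks the jointly best sequence with the Viterbi algorithm rather than the best tag at each position independently. A minimal sketch over a toy B/I/O tag set; the emission and transition scores are illustrative numbers, not learned parameters from any model above:

```python
def viterbi(emissions, transitions):
    """emissions: list of {tag: score} per token;
    transitions: {(prev_tag, cur_tag): score}.
    Returns (best tag sequence, its total score)."""
    tags = list(emissions[0])
    score = {t: emissions[0][t] for t in tags}
    backptrs = []
    for em in emissions[1:]:
        new_score, bp = {}, {}
        for cur in tags:
            # Best previous tag for each current tag.
            prev = max(tags, key=lambda p: score[p] + transitions[(p, cur)])
            new_score[cur] = score[prev] + transitions[(prev, cur)] + em[cur]
            bp[cur] = prev
        backptrs.append(bp)
        score = new_score
    # Backtrack from the highest-scoring final tag.
    last = max(tags, key=lambda t: score[t])
    path = [last]
    for bp in reversed(backptrs):
        path.append(bp[path[-1]])
    path.reverse()
    return path, score[last]

transitions = {(p, c): 0.0 for p in "BIO" for c in "BIO"}
transitions[("B", "I")] = 0.5   # reward B -> I continuation
transitions[("O", "I")] = -10.0  # forbid I directly after O

emissions = [
    {"B": 2.0, "I": 0.5, "O": 0.1},
    {"B": 0.4, "I": 1.0, "O": 0.9},
    {"B": 0.1, "I": 0.2, "O": 1.5},
]
print(viterbi(emissions, transitions))
# (['B', 'I', 'O'], 5.0)
```

The transition scores are what let the layer enforce sentence-level constraints such as never starting a chunk with an I- tag.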
The results show that the proposed method achieves results as good as or better than the other word embeddings on the tasks we investigate.