Chunking
58 papers with code • 5 benchmarks • 4 datasets
Chunking, also known as shallow parsing, identifies contiguous spans of tokens that form syntactic units such as noun phrases or verb phrases.
Example:
| Vinken | , | 61 | years | old |
|---|---|---|---|---|
| B-NP | I-NP | I-NP | I-NP | I-NP |
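In the BIO scheme shown above, `B-` marks the first token of a chunk and `I-` marks a continuation of the same chunk type. A minimal sketch of decoding such tags back into chunk spans (the function name and the handling of stray `I-` tags are illustrative choices, not from any specific library):

```python
def bio_to_chunks(tokens, tags):
    """Group parallel token/BIO-tag lists into (chunk_type, text) spans."""
    chunks, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always opens a new chunk, closing any open one.
            if current:
                chunks.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            # An I- tag of the same type extends the open chunk.
            current[1].append(token)
        else:
            # "O" tags, or I- tags that do not continue the open chunk,
            # are treated here as closing it (a simplifying assumption).
            if current:
                chunks.append(current)
            current = None
    if current:
        chunks.append(current)
    return [(label, " ".join(toks)) for label, toks in chunks]

tokens = ["Vinken", ",", "61", "years", "old"]
tags = ["B-NP", "I-NP", "I-NP", "I-NP", "I-NP"]
print(bio_to_chunks(tokens, tags))
# → [('NP', 'Vinken , 61 years old')]
```

Sequence labeling models such as the BiLSTM-CRF systems listed below predict one such tag per token; evaluation then compares the decoded spans, not individual tags.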
Libraries
Use these libraries to find Chunking models and implementations.

Most implemented papers
Bidirectional LSTM-CRF Models for Sequence Tagging
It can also use sentence level tag information thanks to a CRF layer.
Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks
Selecting optimal parameters for a neural network architecture can often make the difference between mediocre and state-of-the-art performance.
Semi-supervised Multitask Learning for Sequence Labeling
We propose a sequence labeling framework with a secondary training objective, learning to predict surrounding words for every word in the dataset.
NCRF++: An Open-source Neural Sequence Labeling Toolkit
This paper describes NCRF++, a toolkit for neural sequence labeling.
Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning
We find empirically that the contextual representations computed on the retrieval-based input view, constructed through the concatenation of a sentence and its external contexts, can achieve significantly improved performance compared to the original input view based only on the sentence.
Natural Language Processing (almost) from Scratch
We propose a unified neural network architecture and learning algorithm that can be applied to various natural language processing tasks including: part-of-speech tagging, chunking, named entity recognition, and semantic role labeling.
A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks
Transfer and multi-task learning have traditionally focused on either a single source-target pair or very few, similar tasks.
Semi-supervised sequence tagging with bidirectional language models
Pre-trained word embeddings learned from unlabeled text have become a standard component of neural network architectures for NLP tasks.
Design Challenges and Misconceptions in Neural Sequence Labeling
We investigate the design challenges of building effective and efficient neural sequence labeling systems by reproducing twelve neural sequence labeling models, covering most state-of-the-art architectures, and conducting a systematic model comparison on three benchmarks (i.e., NER, chunking, and POS tagging).
Capturing Global Informativeness in Open Domain Keyphrase Extraction
Open-domain KeyPhrase Extraction (KPE) aims to extract keyphrases from documents without domain or quality restrictions, e.g., web pages of varying domains and quality.