Chunking

118 papers with code • 5 benchmarks • 5 datasets

Chunking, also known as shallow parsing, identifies contiguous spans of tokens that form syntactic units such as noun phrases or verb phrases.

Example:

Vinken , 61 years old
B-NP I-NP I-NP I-NP I-NP
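
For illustration, the following minimal sketch produces BIO chunk tags with NLTK's rule-based RegexpParser. The toy grammar and the predicted POS tags are assumptions of this example, so its output will not exactly match the gold annotation above.

```python
# Minimal BIO chunking sketch with NLTK's rule-based RegexpParser.
# Requires the NLTK POS-tagger model (downloaded below); the grammar is a
# toy rule, not a trained chunker, so its tags may differ from gold labels.
import nltk
from nltk.chunk import RegexpParser, tree2conlltags

nltk.download("averaged_perceptron_tagger", quiet=True)

tokens = ["Vinken", ",", "61", "years", "old"]
tagged = nltk.pos_tag(tokens)                 # [(token, POS), ...]

# Toy grammar: a noun phrase is optional determiner/adjectives/numbers + nouns
chunker = RegexpParser("NP: {<DT>?<JJ>*<CD>*<NN.*>+}")
tree = chunker.parse(tagged)                  # shallow parse into a chunk tree

# Convert the chunk tree into (token, POS, BIO chunk tag) triples
for token, pos, chunk_tag in tree2conlltags(tree):
    print(token, pos, chunk_tag)              # e.g. "Vinken NNP B-NP"
```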

Libraries

Use these libraries to find Chunking models and implementations

Most implemented papers

Bidirectional LSTM-CRF Models for Sequence Tagging

determined22/zh-ner-tf 9 Aug 2015

It can also use sentence level tag information thanks to a CRF layer.
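
The core idea, per-token emission scores from a bidirectional LSTM combined with a sentence-level CRF over tag transitions, can be sketched as below. This assumes PyTorch and the third-party pytorch-crf package rather than the paper's original implementation, and the layer sizes are illustrative.

```python
# Hedged sketch of a BiLSTM-CRF tagger (assumes PyTorch and the
# third-party `pytorch-crf` package; not the paper's own code).
import torch.nn as nn
from torchcrf import CRF

class BiLSTMCRF(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional LSTM produces contextual features for each token
        self.lstm = nn.LSTM(emb_dim, hidden_dim // 2, batch_first=True,
                            bidirectional=True)
        self.emit = nn.Linear(hidden_dim, num_tags)   # per-token tag scores
        self.crf = CRF(num_tags, batch_first=True)    # sentence-level tag layer

    def loss(self, tokens, tags, mask):
        emissions = self.emit(self.lstm(self.embed(tokens))[0])
        return -self.crf(emissions, tags, mask=mask)  # negative log-likelihood

    def predict(self, tokens, mask):
        emissions = self.emit(self.lstm(self.embed(tokens))[0])
        return self.crf.decode(emissions, mask=mask)  # Viterbi decoding
```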

Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks

UKPLab/emnlp2017-bilstm-cnn-crf 21 Jul 2017

Selecting optimal parameters for a neural network architecture can often make the difference between mediocre and state-of-the-art performance.

Semi-supervised Multitask Learning for Sequence Labeling

marekrei/sequence-labeler ACL 2017

We propose a sequence labeling framework with a secondary training objective, learning to predict surrounding words for every word in the dataset.
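
As a rough sketch of that secondary objective: the forward LSTM state at each position also predicts the next word and the backward state predicts the previous word, and this auxiliary loss is added to the tagging loss with a small weight. Layer choices and the weighting are assumptions here, not the paper's exact configuration.

```python
# Hedged sketch of a tagger with an auxiliary language-modeling objective:
# predict the next word from the forward LSTM state and the previous word
# from the backward state, alongside the usual per-token tag classifier.
import torch.nn as nn

class TaggerWithLMObjective(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden // 2, batch_first=True,
                            bidirectional=True)
        self.tag_head = nn.Linear(hidden, num_tags)          # primary objective
        self.next_word = nn.Linear(hidden // 2, vocab_size)  # secondary objectives
        self.prev_word = nn.Linear(hidden // 2, vocab_size)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))    # (batch, seq, hidden)
        fwd, bwd = h.chunk(2, dim=-1)           # split directional states
        return self.tag_head(h), self.next_word(fwd), self.prev_word(bwd)

def combined_loss(tag_logits, next_logits, prev_logits,
                  tags, next_words, prev_words, gamma=0.1):
    ce = nn.CrossEntropyLoss()
    main = ce(tag_logits.flatten(0, 1), tags.flatten())
    aux = (ce(next_logits.flatten(0, 1), next_words.flatten())
           + ce(prev_logits.flatten(0, 1), prev_words.flatten()))
    return main + gamma * aux                   # gamma weights the LM objective
```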

NCRF++: An Open-source Neural Sequence Labeling Toolkit

jiesutd/NCRFpp ACL 2018

This paper describes NCRF++, a toolkit for neural sequence labeling.

Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning

modelscope/adaseq ACL 2021

We find empirically that the contextual representations computed on the retrieval-based input view, constructed through the concatenation of a sentence and its external contexts, can achieve significantly improved performance compared to the original input view based only on the sentence.
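
As a rough illustration of the retrieval-based input view, the sketch below encodes a sentence concatenated with one external context and keeps only the representations of the original sentence positions. The model name, the example context, and the slicing heuristic are assumptions for this sketch, not the paper's exact setup.

```python
# Hedged sketch: build the retrieval-based input view by concatenating a
# sentence with a retrieved external context, then take the contextual
# representations covering the original sentence for the tagging layer.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")  # assumed model
encoder = AutoModel.from_pretrained("xlm-roberta-base")

sentence = "Vinken will join the board as a nonexecutive director ."
# Placeholder for a retrieved external context (e.g. a related web snippet)
context = "Pierre Vinken , 61 years old , was named a board director ."

original_view = tokenizer(sentence, return_tensors="pt")            # sentence only
augmented_view = tokenizer(sentence, context, return_tensors="pt")  # sentence + context

with torch.no_grad():
    hidden = encoder(**augmented_view).last_hidden_state

# Only positions covering the original sentence feed the tagging layer;
# slicing to the sentence-only length is a rough approximation here.
sentence_len = original_view["input_ids"].shape[1]
sentence_repr = hidden[:, :sentence_len, :]
print(sentence_repr.shape)   # (1, sentence_len, hidden_size)
```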

Natural Language Processing (almost) from Scratch

faramarzmunshi/d2l-nlp 2 Mar 2011

We propose a unified neural network architecture and learning algorithm that can be applied to various natural language processing tasks including: part-of-speech tagging, chunking, named entity recognition, and semantic role labeling.
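
In outline, such a unified architecture is one shared encoder feeding a separate output layer per task. The sketch below uses a BiLSTM encoder for brevity, whereas the paper itself builds on window and convolutional sentence networks, so treat the layer choices and sizes as assumptions.

```python
# Hedged sketch of a shared encoder with per-task output heads for
# POS tagging, chunking, and NER (encoder choice and sizes are assumptions).
import torch.nn as nn

class UnifiedTagger(nn.Module):
    def __init__(self, vocab_size, task_tagset_sizes, emb_dim=50, hidden=300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Shared feature extractor reused across all tasks
        self.encoder = nn.LSTM(emb_dim, hidden // 2, batch_first=True,
                               bidirectional=True)
        # One output head per task, e.g. {"pos": 45, "chunk": 23, "ner": 9}
        self.heads = nn.ModuleDict({
            task: nn.Linear(hidden, n_tags)
            for task, n_tags in task_tagset_sizes.items()
        })

    def forward(self, tokens, task):
        features, _ = self.encoder(self.embed(tokens))
        return self.heads[task](features)   # per-token tag scores for `task`
```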

A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks

hassyGo/charNgram2vec EMNLP 2017

Transfer and multi-task learning have traditionally focused on either a single source-target pair or very few, similar tasks.

Design Challenges and Misconceptions in Neural Sequence Labeling

jiesutd/NCRFpp COLING 2018

We investigate the design challenges of constructing effective and efficient neural sequence labeling systems, by reproducing twelve neural sequence labeling models, which include most of the state-of-the-art structures, and conduct a systematic model comparison on three benchmarks (i.e., NER, chunking, and POS tagging).

Capturing Global Informativeness in Open Domain Keyphrase Extraction

thunlp/BERT-KPE 28 Apr 2020

Open-domain KeyPhrase Extraction (KPE) aims to extract keyphrases from documents without domain or quality restrictions, e.g., web pages of varying domains and quality.

Automated Concatenation of Embeddings for Structured Prediction

Alibaba-NLP/ACE ACL 2021

Pretrained contextualized embeddings are powerful word representations for structured prediction tasks.