Chunking
68 papers with code • 5 benchmarks • 5 datasets
Chunking, also known as shallow parsing, identifies continuous spans of tokens that form syntactic units such as noun phrases or verb phrases.
Example:

| Vinken | , | 61 | years | old |
|---|---|---|---|---|
| B-NP | I-NP | I-NP | I-NP | I-NP |
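To make the BIO tagging scheme above concrete, here is a minimal sketch of how a tag sequence like the one in the example can be decoded into labeled chunk spans. The helper `bio_to_chunks` is a hypothetical name, not part of any library mentioned on this page; it follows the common convention of treating a stray `I-` tag with a new label as the start of a chunk.

```python
def bio_to_chunks(tags):
    """Decode a BIO tag sequence into (label, start, end) spans, end exclusive."""
    chunks = []
    start, label = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-") or (tag.startswith("I-") and label != tag[2:]):
            # A B- tag, or an I- tag with a mismatched label, starts a new chunk.
            if label is not None:
                chunks.append((label, start, i))
            start, label = i, tag[2:]
        elif tag == "O":
            # Outside any chunk: close the current one if open.
            if label is not None:
                chunks.append((label, start, i))
            start, label = None, None
    if label is not None:
        chunks.append((label, start, len(tags)))
    return chunks

# The example from the table: the whole phrase is one noun-phrase chunk.
tokens = ["Vinken", ",", "61", "years", "old"]
tags = ["B-NP", "I-NP", "I-NP", "I-NP", "I-NP"]
print(bio_to_chunks(tags))  # [('NP', 0, 5)]
```

A sequence-labeling model for chunking predicts one such tag per token; evaluation (e.g. on CoNLL-2000) then compares the decoded spans against gold chunks.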
Libraries

Use these libraries to find Chunking models and implementations.

Most implemented papers
Contextual String Embeddings for Sequence Labeling
Recent advances in language modeling using recurrent neural networks have made it viable to model language as distributions over characters.
Sequence Labeling: A Practical Approach
We take a practical approach to solving sequence labeling problem assuming unavailability of domain expertise and scarcity of informational and computational resources.
Large scale visual place recognition with sub-linear storage growth
Robotic and animal mapping systems share many of the same objectives and challenges, but differ in one key aspect: where much of the research in robotic mapping has focused on solving the data association problem, the grid cell neurons underlying maps in the mammalian brain appear to intentionally break data association by encoding many locations with a single grid cell neuron.
FLAIR: An Easy-to-Use Framework for State-of-the-Art NLP
We present FLAIR, an NLP framework designed to facilitate training and distribution of state-of-the-art sequence labeling, text classification and language models.
GCDT: A Global Context Enhanced Deep Transition Architecture for Sequence Labeling
Current state-of-the-art systems for sequence labeling are typically based on the family of Recurrent Neural Networks (RNNs).
Augmenting Neural Networks with First-order Logic
Today, the dominant paradigm for training neural networks involves minimizing task loss on a large dataset.
Language-Agnostic Syllabification with Neural Sequence Labeling
The concept of the syllable is cross-linguistic, though formal definitions are rarely agreed upon, even within a language.
Gated Task Interaction Framework for Multi-task Sequence Tagging
Others have shown that linguistic features can improve the performance of neural models on tasks such as chunking and named entity recognition (NER).
CmnRec: Sequential Recommendations with Chunk-accelerated Memory Network
Specifically, our framework divides proximal information units into chunks, and performs memory access at certain time steps, whereby the number of memory operations can be greatly reduced.