POS

210 papers with code • 2 benchmarks • 1 dataset

Part-of-speech (POS) tagging is the task of assigning a grammatical category (noun, verb, adjective, etc.) to each token in a sentence.

Most implemented papers

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

guillaumegenthial/sequence_tagging ACL 2016

State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing.

Bidirectional LSTM-CRF Models for Sequence Tagging

determined22/zh-ner-tf 9 Aug 2015

It can also use sentence level tag information thanks to a CRF layer.
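The CRF layer is what lets the model score whole tag sequences instead of making independent per-token decisions: a learned transition matrix rewards or penalizes moving from one tag to the next, and Viterbi decoding picks the globally best path. A minimal NumPy sketch of that decoding step (illustrative only; the shapes and scores below are made up, not taken from the repository):

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring tag sequence.

    emissions:   (seq_len, num_tags) per-token scores, e.g. from a BiLSTM
    transitions: (num_tags, num_tags) score of moving from tag i to tag j
    """
    seq_len, num_tags = emissions.shape
    score = emissions[0].copy()        # best score ending in each tag at step 0
    backpointers = []

    for t in range(1, seq_len):
        # candidate[i, j] = score of being in tag i at t-1 and tag j at t
        candidate = score[:, None] + transitions + emissions[t][None, :]
        backpointers.append(candidate.argmax(axis=0))
        score = candidate.max(axis=0)

    # Follow backpointers from the best final tag
    best_tag = int(score.argmax())
    best_path = [best_tag]
    for bp in reversed(backpointers):
        best_tag = int(bp[best_tag])
        best_path.append(best_tag)
    return list(reversed(best_path)), float(score.max())

# Toy example: 3 tokens, 2 tags
emissions = np.array([[2.0, 0.5], [0.3, 1.8], [1.2, 1.1]])
transitions = np.array([[0.5, -1.0], [-0.2, 0.8]])
print(viterbi_decode(emissions, transitions))
```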

Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks

UKPLab/emnlp2017-bilstm-cnn-crf 21 Jul 2017

Selecting optimal parameters for a neural network architecture can often make the difference between mediocre and state-of-the-art performance.

Joint entity recognition and relation extraction as a multi-head selection problem

bekou/multihead_joint_entity_relation_extraction 20 Apr 2018

State-of-the-art models for joint entity recognition and relation extraction strongly rely on external natural language processing (NLP) tools such as POS (part-of-speech) taggers and dependency parsers.

Semantic Relation Classification via Bidirectional LSTM Networks with Entity-aware Attention using Latent Entity Typing

roomylee/entity-aware-relation-classification 23 Jan 2019

Our model not only uses entities and their latent types effectively as features, but is also more interpretable, as shown by visualizing the attention mechanisms applied to the model and the results of LET (latent entity typing).

Automated Phrase Mining from Massive Text Corpora

shangjingbo1226/AutoPhrase 15 Feb 2017

As one of the fundamental tasks in text analysis, phrase mining aims at extracting quality phrases from a text corpus.

Transfer Learning for Sequence Tagging with Hierarchical Recurrent Networks

kimiyoung/transfer 18 Mar 2017

Recent papers have shown that neural networks obtain state-of-the-art performance on several different sequence tagging tasks.

Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Recurrent Neural Network

aneesh-joshi/LSTM_POS_Tagger 21 Oct 2015

Bidirectional Long Short-Term Memory Recurrent Neural Network (BLSTM-RNN) has been shown to be very effective for tagging sequential data, e.g. speech utterances or handwritten documents.
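As a rough sketch of this style of tagger, here is a minimal bidirectional LSTM POS model in PyTorch; the layer sizes, vocabulary, and tag set are illustrative assumptions, not the repository's actual configuration:

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Minimal bidirectional LSTM POS tagger: embed -> BiLSTM -> per-token tag scores."""

    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_tags)  # forward + backward states

    def forward(self, token_ids):
        x = self.embed(token_ids)          # (batch, seq_len, emb_dim)
        h, _ = self.lstm(x)                # (batch, seq_len, 2 * hidden_dim)
        return self.out(h)                 # (batch, seq_len, num_tags)

# Toy usage: 2 sentences of 5 tokens, a 1000-word vocabulary, 17 UD-style tags
model = BiLSTMTagger(vocab_size=1000, num_tags=17)
tokens = torch.randint(0, 1000, (2, 5))
gold = torch.randint(0, 17, (2, 5))
logits = model(tokens)
loss = nn.CrossEntropyLoss()(logits.view(-1, 17), gold.view(-1))
loss.backward()
```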

Multilingual Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Models and Auxiliary Loss

bplank/bilstm-aux ACL 2016

Bidirectional long short-term memory (bi-LSTM) networks have recently proven successful for various NLP sequence modeling tasks, but little is known about their reliance on input representations, target languages, data set size, and label noise.

Semi-supervised Multitask Learning for Sequence Labeling

marekrei/sequence-labeler ACL 2017

We propose a sequence labeling framework with a secondary training objective, learning to predict surrounding words for every word in the dataset.
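In this setup the shared BiLSTM states feed both the tagging classifier and auxiliary word-prediction heads (forward states predict the next word, backward states the previous one), and the losses are summed with a small weight on the auxiliary terms. A hedged sketch of such a combined objective, assuming the logits are already computed; the tensor names and the gamma value are illustrative, not the paper's exact setup:

```python
import torch
import torch.nn as nn

def joint_loss(tag_logits, gold_tags,
               fwd_lm_logits, next_words,
               bwd_lm_logits, prev_words,
               gamma=0.1):
    """Combine the main tagging loss with auxiliary language-modelling losses.

    tag_logits:    (batch, seq_len, num_tags) per-token tag scores
    fwd_lm_logits: (batch, seq_len, vocab)    forward states predicting the next word
    bwd_lm_logits: (batch, seq_len, vocab)    backward states predicting the previous word
    gamma:         weight on the auxiliary terms (value here is a placeholder)
    """
    ce = nn.CrossEntropyLoss()
    num_tags = tag_logits.size(-1)
    vocab = fwd_lm_logits.size(-1)

    tagging = ce(tag_logits.reshape(-1, num_tags), gold_tags.reshape(-1))
    lm_fwd = ce(fwd_lm_logits.reshape(-1, vocab), next_words.reshape(-1))
    lm_bwd = ce(bwd_lm_logits.reshape(-1, vocab), prev_words.reshape(-1))
    return tagging + gamma * (lm_fwd + lm_bwd)
```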