POS Tagging

125 papers with code • 0 benchmarks • 0 datasets

Part-of-Speech Tagging
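For readers new to the task: a POS tagger assigns each token in a sentence a grammatical category (noun, verb, determiner, and so on). A minimal dictionary-based sketch of the input/output contract — the lexicon, tag names, and fallback rule below are invented for illustration; the papers listed here use statistical or neural models instead:

```python
# Toy dictionary-based POS tagger (illustrative only; the lexicon,
# tag set, and NOUN fallback are invented for this sketch).
LEXICON = {
    "the": "DET", "a": "DET",
    "dog": "NOUN", "cat": "NOUN",
    "barks": "VERB", "sleeps": "VERB",
}

def tag(tokens):
    # Unknown words fall back to NOUN, a common naive baseline.
    return [(tok, LEXICON.get(tok.lower(), "NOUN")) for tok in tokens]

print(tag("The dog barks".split()))
# [('The', 'DET'), ('dog', 'NOUN'), ('barks', 'VERB')]
```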



Most implemented papers

End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF

guillaumegenthial/sequence_tagging ACL 2016

State-of-the-art sequence labeling systems traditionally require large amounts of task-specific knowledge in the form of hand-crafted features and data pre-processing.

Does Manipulating Tokenization Aid Cross-Lingual Transfer? A Study on POS Tagging for Non-Standardized Languages

mainlp/noisydialect 20 Apr 2023

This can for instance be observed when finetuning PLMs on one language and evaluating them on data in a closely related language variety with no standardized orthography.

Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Recurrent Neural Network

aneesh-joshi/LSTM_POS_Tagger 21 Oct 2015

Bidirectional Long Short-Term Memory Recurrent Neural Network (BLSTM-RNN) has been shown to be very effective for tagging sequential data, e.g., speech utterances or handwritten documents.

Transfer Learning for Sequence Tagging with Hierarchical Recurrent Networks

kimiyoung/transfer 18 Mar 2017

Recent papers have shown that neural networks obtain state-of-the-art performance on several different sequence tagging tasks.

PKUSEG: A Toolkit for Multi-Domain Chinese Word Segmentation

lancopku/pkuseg-python 27 Jun 2019

Through this method, we generate synthetic data using a large amount of unlabeled data in the target domain and then obtain a word segmentation model for the target domain.

Multilingual Part-of-Speech Tagging with Bidirectional Long Short-Term Memory Models and Auxiliary Loss

bplank/bilstm-aux ACL 2016

Bidirectional long short-term memory (bi-LSTM) networks have recently proven successful for various NLP sequence modeling tasks, but little is known about their reliance on input representations, target languages, data set size, and label noise.

Semi-supervised Multitask Learning for Sequence Labeling

marekrei/sequence-labeler ACL 2017

We propose a sequence labeling framework with a secondary training objective, learning to predict surrounding words for every word in the dataset.
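As a rough illustration of that secondary objective — pairing each token's tag label with auxiliary targets for the surrounding words — here is how such training items could be assembled (the function name and `<s>`/`</s>` padding convention are mine, not from the paper; the actual framework trains a bi-LSTM with a language-modeling-style loss on these targets):

```python
# Sketch: build auxiliary word-prediction targets alongside tag labels,
# in the spirit of the semi-supervised multitask objective.
# Names and the <s>/</s> padding convention are invented for illustration.
def make_training_items(tokens, tags):
    items = []
    for i, (tok, tg) in enumerate(zip(tokens, tags)):
        prev_word = tokens[i - 1] if i > 0 else "<s>"
        next_word = tokens[i + 1] if i < len(tokens) - 1 else "</s>"
        # Main objective: predict `tg`; auxiliary objective: predict
        # the surrounding words from the same hidden state.
        items.append({"word": tok, "tag": tg,
                      "aux_prev": prev_word, "aux_next": next_word})
    return items

items = make_training_items(["dogs", "bark"], ["NOUN", "VERB"])
# items[0] -> {'word': 'dogs', 'tag': 'NOUN', 'aux_prev': '<s>', 'aux_next': 'bark'}
```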

Portuguese Word Embeddings: Evaluating on Word Analogies and Natural Language Tasks

nathanshartmann/portuguese_word_embeddings WS 2017

Word embeddings have been found to provide meaningful representations for words in an efficient way; therefore, they have become common in Natural Language Processing systems.
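The word-analogy evaluation this paper uses asks "a is to b as c is to ?" and checks whether the nearest vector to b − a + c is the expected word. A stdlib sketch with invented 2-d toy vectors (real evaluations use learned embeddings and large analogy sets):

```python
import math

# Toy embeddings invented for illustration; in practice these are
# learned vectors with hundreds of dimensions.
EMB = {
    "king":  [0.9, 0.8],
    "queen": [0.9, 0.2],
    "man":   [0.1, 0.8],
    "woman": [0.1, 0.2],
    "apple": [0.5, 0.5],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def analogy(a, b, c):
    # Nearest word to vec(b) - vec(a) + vec(c), excluding the query words.
    target = [bb - aa + cc for aa, bb, cc in zip(EMB[a], EMB[b], EMB[c])]
    candidates = {w: v for w, v in EMB.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("man", "woman", "king"))  # -> 'queen'
```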

LINSPECTOR: Multilingual Probing Tasks for Word Representations

UKPLab/linspector CL 2020

We present a reusable methodology for creation and evaluation of such tests in a multilingual setting.

Segmental Recurrent Neural Networks

ykrmm/TREMBA 18 Nov 2015

Representations of the input segments (i.e., contiguous subsequences of the input) are computed by encoding their constituent tokens using bidirectional recurrent neural nets, and these "segment embeddings" are used to define compatibility scores with output labels.
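The segmental idea starts from enumerating the contiguous subsequences of the input up to some maximum length; each one is then encoded and scored against output labels. A sketch of just the enumeration step (function name and toy input are mine; the encoding and scoring are the bi-RNN machinery described above):

```python
# Enumerate contiguous segments (start, end, tokens) up to max_len.
# In the segmental model, each such segment would be encoded with a
# bidirectional RNN and scored against candidate output labels.
def enumerate_segments(tokens, max_len=3):
    segs = []
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_len, len(tokens)) + 1):
            segs.append((start, end, tokens[start:end]))
    return segs

segs = enumerate_segments(["New", "York", "City"], max_len=2)
# [(0, 1, ['New']), (0, 2, ['New', 'York']),
#  (1, 2, ['York']), (1, 3, ['York', 'City']), (2, 3, ['City'])]
```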