POS
253 papers with code • 4 benchmarks • 4 datasets
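For orientation, here is a minimal sketch of the task itself using spaCy, one of several NLP libraries that ship a POS tagger (the specific libraries linked from this page are not shown here, so this choice is an assumption). It presumes spaCy is installed and the small English model has been fetched with `python -m spacy download en_core_web_sm`.

```python
# Minimal POS-tagging sketch (assumes spaCy and the en_core_web_sm model are installed).
import spacy

nlp = spacy.load("en_core_web_sm")   # small English pipeline that includes a POS tagger
doc = nlp("The quick brown fox jumps over the lazy dog.")

for token in doc:
    # token.pos_ is the coarse Universal POS tag, token.tag_ the fine-grained tag
    print(f"{token.text}\t{token.pos_}\t{token.tag_}")
```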
Latest papers with no code
Proof-of-Learning with Incentive Security
Most current blockchain systems rely heavily on the Proof-of-Work (PoW) or Proof-of-Stake (PoS) mechanisms for decentralized consensus and security assurance.
Transformers as Transducers
We study the sequence-to-sequence mapping capacity of transformers by relating them to finite transducers, and find that they can express surprisingly large classes of transductions.
Exploring language relations through syntactic distances and geographic proximity
Languages are grouped into families that share common linguistic traits.
Leveraging Linguistically Enhanced Embeddings for Open Information Extraction
To bridge this gap, we are the first to leverage linguistic features with a Seq2Seq PLM for OIE.
PostoMETRO: Pose Token Enhanced Mesh Transformer for Robust 3D Human Mesh Recovery
With the recent advancements in single-image-based human mesh recovery, there is a growing interest in enhancing its performance in certain extreme scenarios, such as occlusion, while maintaining overall model accuracy.
MaiBaam: A Multi-Dialectal Bavarian Universal Dependency Treebank
Despite the success of the Universal Dependencies (UD) project, exemplified by its impressive language breadth, there is still a lack in "within-language breadth": most treebanks focus on standard languages.
MRL Parsing Without Tears: The Case of Hebrew
Syntactic parsing remains a critical tool for relation extraction and information extraction, especially in resource-scarce languages where LLMs are lacking.
MaiBaam Annotation Guidelines
This document provides the annotation guidelines for MaiBaam, a Bavarian corpus annotated with part-of-speech (POS) tags and syntactic dependencies.
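For readers unfamiliar with the format such guidelines target: UD treebanks like MaiBaam are distributed as CoNLL-U files, in which each token row carries a universal POS tag (UPOS) and a dependency head and relation. The sketch below walks an invented English two-token example (not MaiBaam data) to show the relevant columns.

```python
# Illustrative CoNLL-U token rows (invented English example, not taken from MaiBaam).
# The ten columns are: ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC.
rows = [
    ["1", "Birds", "bird", "NOUN", "_", "Number=Plur", "2", "nsubj", "_", "_"],
    ["2", "sing",  "sing", "VERB", "_", "_",           "0", "root",  "_", "_"],
]

for cols in rows:
    form, upos, head, deprel = cols[1], cols[3], cols[6], cols[7]
    # Each token gets a POS tag plus a pointer to its syntactic head and relation label.
    print(f"{form}\tUPOS={upos}\thead={head}\tdeprel={deprel}")
```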
Whose LLM is it Anyway? Linguistic Comparison and LLM Attribution for GPT-3.5, GPT-4 and Bard
Large Language Models (LLMs) are capable of generating text whose quality matches or surpasses that of human writing.
An Effective Incorporating Heterogeneous Knowledge Curriculum Learning for Sequence Labeling
To address this challenge, we propose a two-stage curriculum learning (TCL) framework specifically designed for sequence labeling tasks.