Semantic Role Labeling
132 papers with code • 7 benchmarks • 14 datasets
Semantic role labeling aims to model the predicate-argument structure of a sentence and is often described as answering "Who did what to whom". BIO notation is typically used for semantic role labeling.
Example:
| Housing | starts | are | expected | to | quicken | a | bit | from | August’s | pace |
|---|---|---|---|---|---|---|---|---|---|---|
| B-ARG1 | I-ARG1 | O | O | O | V | B-ARG2 | I-ARG2 | B-ARG3 | I-ARG3 | I-ARG3 |
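To make the notation concrete, here is a minimal sketch in plain Python (no SRL library assumed; `decode_bio` is a hypothetical helper written for this example) that decodes the BIO tag sequence above into labeled argument spans:

```python
def decode_bio(tokens, tags):
    """Group a BIO tag sequence into (label, span text) pairs.

    'B-X' starts a new span with role X, 'I-X' continues it,
    'O' is outside any span, and 'V' marks the predicate.
    """
    spans, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:                       # close any open span
                spans.append(current)
            current = [tag[2:], [tok]]        # open a new span
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(tok)            # extend the open span
        else:
            if current:
                spans.append(current)
                current = None
            if tag == "V":                    # the predicate is its own span
                spans.append(["V", [tok]])
    if current:
        spans.append(current)
    return [(label, " ".join(toks)) for label, toks in spans]

tokens = ["Housing", "starts", "are", "expected", "to", "quicken",
          "a", "bit", "from", "August's", "pace"]
tags = ["B-ARG1", "I-ARG1", "O", "O", "O", "V",
        "B-ARG2", "I-ARG2", "B-ARG3", "I-ARG3", "I-ARG3"]
print(decode_bio(tokens, tags))
# → [('ARG1', 'Housing starts'), ('V', 'quicken'),
#    ('ARG2', 'a bit'), ('ARG3', 'from August's pace')]
```

Decoding recovers exactly the "who did what to whom" structure: ARG1 (the thing that quickens), the predicate, and the manner/source arguments.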
Latest papers with no code
Heterogeneous Line Graph Transformer for Math Word Problems
We originally planned to employ existing models but realized that they processed a math word problem as a sequence or a homogeneous graph of tokens.
Fast and Accurate Span-based Semantic Role Labeling as Graph Parsing
Currently, BIO-based and Tuple-based approaches perform quite well on the span-based semantic role labeling (SRL) task.
An MRC Framework for Semantic Role Labeling
In this way, we are able to leverage both the predicate semantics and the semantic role semantics for argument labeling.
Toward Automatic Misinformation Detection Utilizing Fact-checked Information
The goal was to fact-check a sentence utilizing verified claims stored in the database.
To Augment or Not to Augment? A Comparative Study on Text Augmentation Techniques for Low-Resource NLP
Although NLP has recently seen a wealth of textual augmentation techniques, the field still lacks a systematic performance analysis across a diverse set of languages and sequence tagging tasks.
Remove Noise and Keep Truth: A Noisy Channel Model for Semantic Role Labeling
Semantic role labeling usually models structures using sequences, trees, or graphs.
Semantic Role Labeling as Dependency Parsing: Exploring Latent Tree Structures Inside Arguments
Semantic role labeling (SRL) is a fundamental yet challenging task in the NLP community.
Zero-shot Cross-lingual Conversational Semantic Role Labeling
While conversational semantic role labeling (CSRL) has proven useful for Chinese conversational tasks, it remains under-explored in non-Chinese languages due to the lack of multilingual CSRL annotations for parser training.
AMRize, then Parse! Enhancing AMR Parsing with PseudoAMR Data
As Abstract Meaning Representation (AMR) implicitly involves compound semantic annotations, we hypothesize auxiliary tasks which are semantically or formally related can better enhance AMR parsing.
Learning Disentangled Representations in Natural Language Definitions with Semantic Role Labeling Supervision
Disentangling the encodings of neural models is a fundamental aspect for improving interpretability, semantic control and downstream task performance in Natural Language Processing.