Coreference Resolution
263 papers with code • 16 benchmarks • 43 datasets
Coreference resolution is the task of clustering mentions in text that refer to the same underlying real-world entity.
Example:
"I voted for Obama because he was most aligned with my values," she said.
"I", "my", and "she" belong to the same cluster and "Obama" and "he" belong to the same cluster.
Libraries
Use these libraries to find Coreference Resolution models and implementations.
Latest papers
HuixiangDou-CR: Coreference Resolution in Group Chats
How can pronominal references be resolved in group chats?
Transforming Dutch: Debiasing Dutch Coreference Resolution Systems for Non-binary Pronouns
We further show that CDA remains effective in low-resource settings, in which a limited set of debiasing documents is used.
Asking and Answering Questions to Extract Event-Argument Structures
Transformer-based questions are generated using large language models trained to formulate questions based on a passage and the expected answer.
REXEL: An End-to-end Model for Document-Level Relation Extraction and Entity Linking
Extracting structured information from unstructured text is critical for many downstream NLP applications and is traditionally achieved by closed information extraction (cIE).
Multimodal Cross-Document Event Coreference Resolution Using Linear Semantic Transfer and Mixed-Modality Ensembles
We establish three methods that incorporate images and text for coreference: 1) a standard fused model with finetuning, 2) a novel linear mapping method without finetuning and 3) an ensembling approach based on splitting mention pairs by semantic and discourse-level difficulty.
Okay, Let's Do This! Modeling Event Coreference with Generated Rationales and Knowledge Distillation
In NLP, Event Coreference Resolution (ECR) is the task of connecting event clusters that refer to the same underlying real-life event, usually via neural systems.
A Rationale-centric Counterfactual Data Augmentation Method for Cross-Document Event Coreference Resolution
Based on Pre-trained Language Models (PLMs), event coreference resolution (ECR) systems have demonstrated outstanding performance in clustering coreferential events across documents.
A Controlled Reevaluation of Coreference Resolution Models
When controlling for language model size, encoder-based CR models outperform more recent decoder-based models in terms of both accuracy and inference speed.
Linear Cross-document Event Coreference Resolution with X-AMR
We then linearize the ECR with a novel multi-hop coreference algorithm over the event graphs.
SPLICE: A Singleton-Enhanced PipeLIne for Coreference REsolution
We then propose a two-step neural mention and coreference resolution system, named SPLICE, and compare its performance to the end-to-end approach in two scenarios: the OntoNotes test set and the out-of-domain (OOD) OntoGUM corpus.