Coreference Resolution

147 papers with code • 9 benchmarks • 35 datasets

Coreference resolution is the task of clustering mentions in text that refer to the same underlying real world entities.

Example:

                +-----------+
                |           |
"I voted for Obama because he was most aligned with my values", she said.
 |                                                   |           |
 +---------------------------------------------------+-----------+

"I", "my", and "she" belong to the same cluster and "Obama" and "he" belong to the same cluster.
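The output of a coreference system can be pictured as a set of clusters, each holding the character spans of mentions that co-refer. The sketch below is a toy illustration of that data structure for the example sentence; the spans are hand-annotated for the example, not the output of any model.

```python
# Toy illustration of coreference output: clusters of mention spans
# (start, end character offsets) over the example sentence.
# Spans are hand-annotated here, not produced by a model.

text = '"I voted for Obama because he was most aligned with my values", she said.'

# Each cluster lists the (start, end) spans of mentions that refer
# to the same underlying entity.
clusters = [
    [(1, 2), (52, 54), (64, 67)],   # "I", "my", "she" -> the speaker
    [(13, 18), (27, 29)],           # "Obama", "he"    -> Obama
]

def mentions(text, cluster):
    """Return the surface strings for one cluster of spans."""
    return [text[start:end] for start, end in cluster]

for cluster in clusters:
    print(mentions(text, cluster))
```

Running this prints one list of mention strings per entity, mirroring the two clusters described above.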

Greatest papers with code

DeBERTa: Decoding-enhanced BERT with Disentangled Attention

huggingface/transformers ICLR 2021

Recent progress in pre-trained neural language models has significantly improved the performance of many natural language processing (NLP) tasks.

Common Sense Reasoning Coreference Resolution +11

Deep contextualized word representations

flairNLP/flair NAACL 2018

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy).

Ranked #2 on Citation Intent Classification on ACL-ARC (using extra training data)

Citation Intent Classification Conversational Response Selection +7

Language Models are Few-Shot Learners

openai/gpt-3 NeurIPS 2020

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.

Common Sense Reasoning Coreference Resolution +10

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages

stanfordnlp/stanza ACL 2020

We introduce Stanza, an open-source Python natural language processing toolkit supporting 66 human languages.

Coreference Resolution Dependency Parsing +4

Higher-order Coreference Resolution with Coarse-to-fine Inference

kentonl/e2e-coref NAACL 2018

We introduce a fully differentiable approximation to higher-order inference for coreference resolution.

Coreference Resolution

End-to-end Neural Coreference Resolution

kentonl/e2e-coref EMNLP 2017

We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or hand-engineered mention detector.

Coreference Resolution

BERT for Coreference Resolution: Baselines and Analysis

mandarjoshi90/coref IJCNLP 2019

We apply BERT to coreference resolution, achieving strong improvements on the OntoNotes (+3.9 F1) and GAP (+11.5 F1) benchmarks.

Coreference Resolution Document-level