Coreference Resolution

253 papers with code • 15 benchmarks • 42 datasets

Coreference resolution is the task of clustering mentions in text that refer to the same underlying real-world entities.

Example:

               +-----------+
               |           |
"I voted for Obama because he was most aligned with my values," she said.
 |                                                  |           |
 +--------------------------------------------------+-----------+

"I", "my", and "she" belong to the same cluster and "Obama" and "he" belong to the same cluster.

Most implemented papers

Attention Is All You Need

tensorflow/tensor2tensor NeurIPS 2017

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.
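The Transformer replaces that recurrence and convolution with attention alone. Below is a minimal sketch of the paper's scaled dot-product attention in PyTorch; the tensor shapes are illustrative and not taken from the paper's experiments.

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    return F.softmax(scores, dim=-1) @ v

# Illustrative shapes: batch of 2, sequence of 5 tokens, dimension 64.
q = k = v = torch.randn(2, 5, 64)
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 5, 64])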

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

google-research/bert NAACL 2019

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.
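BERT encoders are the usual backbone of the coreference models listed here, with a mention span typically represented by its boundary token states. A hedged sketch using the Hugging Face transformers library follows; the checkpoint name and the span indices are assumptions for illustration only.

import torch
from transformers import AutoTokenizer, AutoModel

# Assumption: bert-base-cased for illustration; coreference systems usually
# fine-tune larger checkpoints.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModel.from_pretrained("bert-base-cased")

text = "I voted for Obama because he was most aligned with my values."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state[0]   # (num_wordpieces, 768)

# A common span representation: concatenate the first and last token states of a
# candidate mention (here a hypothetical single-wordpiece span for "Obama").
start, end = 4, 4
span_repr = torch.cat([hidden[start], hidden[end]])  # shape (1536,)
print(span_repr.shape)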

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

huggingface/transformers arXiv 2019

Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP).

Deep contextualized word representations

flairNLP/flair NAACL 2018

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy).

Language Models are Few-Shot Learners

openai/gpt-3 NeurIPS 2020

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.

Language Models are Unsupervised Multitask Learners

openai/gpt-2 Preprint 2019

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets.

DeBERTa: Decoding-enhanced BERT with Disentangled Attention

microsoft/DeBERTa ICLR 2021

Recent progress in pre-trained neural language models has significantly improved the performance of many natural language processing (NLP) tasks.

SpanBERT: Improving Pre-training by Representing and Predicting Spans

facebookresearch/SpanBERT TACL 2020

We present SpanBERT, a pre-training method that is designed to better represent and predict spans of text.
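The method masks contiguous spans and adds a span boundary objective that predicts each masked token from the encoder states just outside the span plus a relative position embedding. The sketch below approximates that head; the hidden size, vocabulary size, and module layout are illustrative assumptions, not the authors' code.

import torch
import torch.nn as nn

class SpanBoundaryHead(nn.Module):
    # Sketch of a span boundary objective: predict a token inside a masked span
    # from the states at the span boundaries plus a relative position embedding.
    def __init__(self, hidden=768, max_span=10, vocab=30522):
        super().__init__()
        self.pos = nn.Embedding(max_span, hidden)
        self.mlp = nn.Sequential(
            nn.Linear(3 * hidden, hidden), nn.GELU(), nn.LayerNorm(hidden),
            nn.Linear(hidden, hidden), nn.GELU(), nn.LayerNorm(hidden),
        )
        self.decoder = nn.Linear(hidden, vocab)

    def forward(self, left_state, right_state, rel_position):
        h = torch.cat([left_state, right_state, self.pos(rel_position)], dim=-1)
        return self.decoder(self.mlp(h))  # logits over the vocabulary

head = SpanBoundaryHead()
left = right = torch.randn(1, 768)
logits = head(left, right, torch.tensor([2]))  # predict the 3rd token in the span
print(logits.shape)  # torch.Size([1, 30522])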

Higher-order Coreference Resolution with Coarse-to-fine Inference

kentonl/e2e-coref NAACL 2018

We introduce a fully differentiable approximation to higher-order inference for coreference resolution.
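In that approximation, each span representation is iteratively updated with the softmax-weighted average of its candidate antecedents through a learned gate. The following is a rough sketch of one refinement step with made-up sizes; coarse-to-fine antecedent pruning and masking of invalid antecedents are omitted.

import torch

def refine_spans(g, antecedent_scores, gate_proj):
    # One refinement step (sketch): g holds span representations (num_spans, dim),
    # antecedent_scores holds pairwise scores (num_spans, num_spans).
    p = torch.softmax(antecedent_scores, dim=-1)          # antecedent distribution
    a = p @ g                                             # expected antecedent repr
    f = torch.sigmoid(gate_proj(torch.cat([g, a], -1)))   # learned gate
    return f * g + (1 - f) * a                            # interpolated update

num_spans, dim = 6, 32                                    # illustrative sizes
g = torch.randn(num_spans, dim)
scores = torch.randn(num_spans, num_spans)
gate_proj = torch.nn.Linear(2 * dim, dim)
print(refine_spans(g, scores, gate_proj).shape)           # torch.Size([6, 32])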

Multi-Task Identification of Entities, Relations, and Coreference for Scientific Knowledge Graph Construction

luanyi/DyGIE EMNLP 2018

We introduce a multi-task setup of identifying and classifying entities, relations, and coreference clusters in scientific articles.