
Relation Extraction

58 papers with code · Natural Language Processing

State-of-the-art leaderboards

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

The Natural Language Decathlon: Multitask Learning as Question Answering

ICLR 2019 salesforce/decaNLP

Furthermore, we present a new Multitask Question Answering Network (MQAN) that jointly learns all tasks in decaNLP without any task-specific modules or parameters in the multitask setting. Though designed for decaNLP, MQAN also achieves state-of-the-art results on the WikiSQL semantic parsing task in the single-task setting.
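The unified framing the abstract describes can be sketched concretely: every task becomes a (question, context, answer) triple, so a single model sees one input format. The task names and strings below are illustrative inventions, not decaNLP's actual data.

```python
# Hypothetical sketch of the decaNLP framing: each task is cast as a
# (question, context, answer) triple so one model can handle them all.
# The questions and examples are invented for illustration.

def as_qa_triple(task: str, context: str, answer: str) -> dict:
    """Map a task-specific input to the unified QA format."""
    questions = {
        "sentiment": "Is this review positive or negative?",
        "relation_extraction": "What is the relationship between the entities?",
        "translation": "What is the translation from English to German?",
    }
    return {"question": questions[task], "context": context, "answer": answer}

examples = [
    as_qa_triple("sentiment", "A delightful, moving film.", "positive"),
    as_qa_triple("relation_extraction",
                 "Marie Curie was born in Warsaw.", "place_of_birth"),
]
```

Because every example shares the same three fields, a single sequence-to-sequence model can be trained on all of them without task-specific heads.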

DOMAIN ADAPTATION MACHINE TRANSLATION NAMED ENTITY RECOGNITION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING RELATION EXTRACTION SEMANTIC PARSING SEMANTIC ROLE LABELING SENTIMENT ANALYSIS TEXT CLASSIFICATION TRANSFER LEARNING

A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks

14 Nov 2018 huggingface/hmtl

Much effort has been devoted to evaluating whether multi-task learning can be leveraged to learn rich representations that can be used in various Natural Language Processing (NLP) downstream applications. The model is trained in a hierarchical fashion to introduce an inductive bias by supervising a set of low-level tasks at the bottom layers of the model and more complex tasks at the top layers.
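The hierarchical supervision scheme can be illustrated with a dependency-free toy: a lower layer feeds a simple task (NER), and a higher layer built on top of it feeds a more complex one (relation extraction). The stand-in "layers" here are plain functions, not the paper's neural encoders.

```python
# Toy sketch of hierarchical multi-task supervision: the simple task reads
# the bottom layer's output; the complex task reads a layer built on top.
# These functions are illustrative stand-ins for neural network layers.

def lower_layer(tokens):
    # Stand-in bottom "encoder": each token becomes a tiny feature dict.
    return [{"text": t, "cap": t[0].isupper()} for t in tokens]

def ner_head(feats):
    # Low-level task (NER) is supervised at the bottom of the hierarchy.
    return ["ENT" if f["cap"] else "O" for f in feats]

def upper_layer(feats):
    # The higher layer builds on the lower one's features.
    return [f["text"] for f in feats if f["cap"]]

def relation_head(entities):
    # The complex task (relation extraction) is supervised at the top.
    return [(a, b) for i, a in enumerate(entities) for b in entities[i + 1:]]

tokens = "Marie Curie worked in Paris".split()
feats = lower_layer(tokens)
tags = ner_head(feats)                        # ['ENT', 'ENT', 'O', 'O', 'ENT']
pairs = relation_head(upper_layer(feats))
```

The inductive bias is in the wiring: the relation head never sees raw tokens, only features shaped by the layer that the simpler task supervises.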

MULTI-TASK LEARNING NAMED ENTITY RECOGNITION RELATION EXTRACTION

Indirect Supervision for Relation Extraction using Question-Answer Pairs

30 Oct 2017 shanzhenren/CoType

However, due to the incompleteness of knowledge bases and the context-agnostic labeling, the training data collected via distant supervision (DS) can be very noisy. In this paper, we propose a novel framework, ReQuest, to leverage question-answer pairs as an indirect source of supervision for relation extraction, and study how to use such supervision to reduce noise induced from DS.
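Why DS training data is noisy can be seen in a few lines: any sentence mentioning both entities of a knowledge-base fact inherits that fact's relation label, whether or not the sentence actually expresses it. The KB fact and sentences below are invented for illustration; this is not the ReQuest system.

```python
# Toy illustration of context-agnostic distant-supervision labeling.
# The "knowledge base" and sentences are made up for this example.

kb = {("Barack Obama", "Hawaii"): "born_in"}  # hypothetical KB fact

sentences = [
    "Barack Obama was born in Hawaii.",        # label is correct here
    "Barack Obama visited Hawaii last week.",  # same label, wrong context
]

def distant_label(sentence, kb):
    # Any sentence containing both entities inherits the KB relation.
    labels = []
    for (e1, e2), rel in kb.items():
        if e1 in sentence and e2 in sentence:
            labels.append((e1, e2, rel))
    return labels

labeled = [(s, distant_label(s, kb)) for s in sentences]
# Both sentences receive the born_in label; the second one is noise.
```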

QUESTION ANSWERING RELATION EXTRACTION

CoType: Joint Extraction of Typed Entities and Relations with Knowledge Bases

27 Oct 2016 shanzhenren/CoType

We propose a novel domain-independent framework, called CoType, that runs a data-driven text segmentation algorithm to extract entity mentions, and jointly embeds entity mentions, relation mentions, text features and type labels into two low-dimensional spaces (for entity and relation mentions respectively), where, in each space, objects whose types are close will also have similar representations. We formulate a joint optimization problem to learn embeddings from text corpora and knowledge bases, adopting a novel partial-label loss function for noisy labeled data and introducing an object "translation" function to capture the cross-constraints of entities and relations on each other.
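A partial-label loss can be sketched in hinge form: a noisily labeled example carries a *set* of candidate types rather than one clean label, and the loss pushes the best-scoring candidate above the best-scoring non-candidate by a margin. This is a generic sketch with a toy score table; CoType's actual objective differs in its details.

```python
# Hedged sketch of a hinge-style partial-label loss for noisy labels.
# "scores" maps each relation type to a model score (toy numbers here);
# "candidates" is the noisy candidate-label set from distant supervision.

def partial_label_loss(scores, candidates, margin=1.0):
    best_pos = max(scores[y] for y in candidates)
    best_neg = max(scores[y] for y in scores if y not in candidates)
    return max(0.0, margin - best_pos + best_neg)

scores = {"born_in": 0.9, "visited": 0.4, "employs": -0.2}

# If the true type is among the candidates and scores highest, the loss
# can reach zero; a weak candidate set incurs a large loss.
loss_good = partial_label_loss(scores, {"born_in", "visited"})
loss_bad = partial_label_loss(scores, {"visited"})
```

The key difference from a standard hinge loss is the inner `max` over candidates: the model only needs *some* candidate to win, which is the right demand when the label set is noisy.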

RELATION EXTRACTION

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

25 Jan 2019 dmis-lab/biobert

Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows. However, as deep learning models require a large amount of training data, applying deep learning to biomedical text mining is often unsuccessful due to the lack of training data in biomedical fields.

NAMED ENTITY RECOGNITION QUESTION ANSWERING RELATION EXTRACTION

A Discourse-Level Named Entity Recognition and Relation Extraction Dataset for Chinese Literature Text

19 Nov 2017 lancopku/Chinese-Literature-NER-RE-Dataset

To build a high-quality dataset, we propose two tagging methods to solve the problem of data inconsistency: a heuristic tagging method and a machine auxiliary tagging method. Based on this corpus, we also introduce several widely used models to conduct experiments.

NAMED ENTITY RECOGNITION RELATION EXTRACTION

Learning to Compose Domain-Specific Transformations for Data Augmentation

NeurIPS 2017 HazyResearch/tanda

Data augmentation is a ubiquitous technique for increasing the size of labeled training sets by leveraging task-specific data transformations that preserve class labels. While it is often easy for domain experts to specify individual transformations, constructing and tuning the more sophisticated compositions typically needed to achieve state-of-the-art results is a time-consuming manual task in practice.
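Composing individual transformation functions (TFs) into an augmentation pipeline can be sketched as below. The TFs are toy, deterministic examples; TANDA's contribution, *learning* which compositions to apply, is omitted from this sketch.

```python
# Minimal sketch of composing label-preserving transformation functions.
# Both TFs are invented toy examples that plausibly preserve a sentence's
# class label for tasks like relation extraction.

def drop_articles(words):
    # TF 1: removing articles usually keeps the label unchanged.
    return [w for w in words if w.lower() not in {"the", "a", "an"}]

def lowercase(words):
    # TF 2: case normalization.
    return [w.lower() for w in words]

def compose(*tfs):
    # Build one augmentation from a sequence of individual TFs.
    def pipeline(x):
        for tf in tfs:
            x = tf(x)
        return x
    return pipeline

augment = compose(drop_articles, lowercase)
augmented = augment("The model extracts a relation".split())
# ['model', 'extracts', 'relation']
```

The hard part the paper addresses is exactly what this sketch leaves fixed: which TFs to chain, in what order, and with what parameters.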

RELATION EXTRACTION

Training Classifiers with Natural Language Explanations

ACL 2018 HazyResearch/babble

Training accurate classifiers requires many labels, but each label provides only limited information (one bit for binary classification). In this work, we propose BabbleLabble, a framework for training classifiers in which an annotator provides a natural language explanation for each labeling decision.
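The core idea, turning an explanation into an executable labeling function, can be sketched with an invented mini-grammar ("because the word 'X' appears"), far simpler than the paper's actual semantic parser.

```python
# Toy sketch of parsing a natural language explanation into a labeling
# function. The one-pattern "grammar" below is invented for illustration
# and is much simpler than BabbleLabble's semantic parser.

import re

def explanation_to_lf(explanation, label):
    m = re.search(r"the word '([^']+)' appears", explanation)
    if not m:
        raise ValueError("unparseable explanation")
    word = m.group(1)

    def lf(sentence):
        # Return the label if the cue word is present, else abstain (None).
        return label if word in sentence.split() else None

    return lf

lf = explanation_to_lf("True because the word 'married' appears", "SPOUSE")
lf("Ann married Bob")  # 'SPOUSE'
lf("Ann met Bob")      # None (abstain)
```

One explanation thus yields a function that can label many examples, which is where the extra bits of information per annotation come from.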

RELATION EXTRACTION

Simultaneously Self-Attending to All Mentions for Full-Abstract Biological Relation Extraction

HLT 2018 patverga/bran

Most work in relation extraction forms a prediction by looking at a short span of text within a single sentence containing a single entity pair mention. This approach often does not consider interactions across mentions, requires redundant computation for each mention pair, and ignores relationships expressed across sentence boundaries.

RELATION EXTRACTION