
Natural Language Inference

104 papers with code · Natural Language Processing

Natural language inference is the task of determining whether a "hypothesis" is true (entailment), false (contradiction), or undetermined (neutral) given a "premise".

Example:

| Premise | Label | Hypothesis |
| --- | --- | --- |
| A man inspects the uniform of a figure in some East Asian country. | contradiction | The man is sleeping. |
| An older and younger man smiling. | neutral | Two men are smiling and laughing at the cats playing on the floor. |
| A soccer game with multiple males playing. | entailment | Some men are playing a sport. |
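In datasets such as SNLI, each example is simply a premise–hypothesis pair tagged with one of the three labels above. A minimal sketch of this data layout, using the examples from the table (field names are illustrative, not a fixed schema):

```python
# Minimal sketch of SNLI-style NLI examples (field names illustrative).
LABELS = ("entailment", "contradiction", "neutral")

def make_example(premise: str, hypothesis: str, label: str) -> dict:
    """Bundle one premise-hypothesis pair with its gold label."""
    if label not in LABELS:
        raise ValueError(f"unknown label: {label}")
    return {"premise": premise, "hypothesis": hypothesis, "label": label}

examples = [
    make_example("A man inspects the uniform of a figure in some East Asian country.",
                 "The man is sleeping.", "contradiction"),
    make_example("An older and younger man smiling.",
                 "Two men are smiling and laughing at the cats playing on the floor.", "neutral"),
    make_example("A soccer game with multiple males playing.",
                 "Some men are playing a sport.", "entailment"),
]
```

A model for this task takes the premise and hypothesis as input and predicts one of the three labels.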

State-of-the-art leaderboards

Greatest papers with code

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

11 Oct 2018 google-research/bert

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations by jointly conditioning on both left and right context in all layers.

COMMON SENSE REASONING CROSS-LINGUAL NATURAL LANGUAGE INFERENCE NAMED ENTITY RECOGNITION QUESTION ANSWERING

Deep contextualized word representations

HLT 2018 zalandoresearch/flair

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus.

COREFERENCE RESOLUTION LANGUAGE MODELLING NAMED ENTITY RECOGNITION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SEMANTIC ROLE LABELING SENTIMENT ANALYSIS

A Structured Self-attentive Sentence Embedding

9 Mar 2017 jadore801120/attention-is-all-you-need-pytorch

This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention. Instead of using a vector, we use a 2-D matrix to represent the embedding, with each row of the matrix attending on a different part of the sentence.
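The 2-D embedding described above works as follows: given token hidden states H (one row per token), an annotation matrix A = softmax(W2 · tanh(W1 · Hᵀ)) is computed, and the sentence embedding is M = A · H, where each of the r rows of A attends to a different part of the sentence. A pure-Python sketch under those definitions (weight shapes follow the formula; all values here are arbitrary illustrations):

```python
import math

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def softmax(row):
    """Numerically stable softmax over one row."""
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def structured_self_attention(H, W1, W2):
    """A = softmax(W2 . tanh(W1 . H^T)); M = A . H.

    H:  n x u hidden states (one row per token)
    W1: d x u, W2: r x d  ->  A is r x n, M is r x u.
    Each of the r rows of A is an attention distribution over tokens.
    """
    Ht = [list(col) for col in zip(*H)]                           # u x n
    S = [[math.tanh(v) for v in row] for row in matmul(W1, Ht)]   # d x n
    A = [softmax(row) for row in matmul(W2, S)]                   # r x n
    M = matmul(A, H)                                              # r x u: the 2-D sentence embedding
    return A, M
```

Because each row of A is a softmax distribution over the n tokens, each row of M is a different attention-weighted summary of the sentence, which is what makes the embedding a matrix rather than a single vector.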

NATURAL LANGUAGE INFERENCE SENTENCE EMBEDDING SENTIMENT ANALYSIS

The Natural Language Decathlon: Multitask Learning as Question Answering

ICLR 2019 salesforce/decaNLP

We present a new Multitask Question Answering Network (MQAN) that jointly learns all tasks in decaNLP without any task-specific modules or parameters in the multitask setting. Though designed for decaNLP, MQAN also achieves state-of-the-art results on the WikiSQL semantic parsing task in the single-task setting.

DOMAIN ADAPTATION MACHINE TRANSLATION NAMED ENTITY RECOGNITION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING RELATION EXTRACTION SEMANTIC PARSING SEMANTIC ROLE LABELING SENTIMENT ANALYSIS TEXT CLASSIFICATION TRANSFER LEARNING

Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond

26 Dec 2018 facebookresearch/LASER

We introduce an architecture to learn joint multilingual sentence representations for 93 languages, belonging to more than 30 different language families and written in 28 different scripts. Finally, we introduce a new test set of aligned sentences in 122 languages based on the Tatoeba corpus, and show that our sentence embeddings obtain strong results in multilingual similarity search even for low-resource languages.

CROSS-LINGUAL BITEXT MINING CROSS-LINGUAL DOCUMENT CLASSIFICATION CROSS-LINGUAL NATURAL LANGUAGE INFERENCE CROSS-LINGUAL TRANSFER DOCUMENT CLASSIFICATION JOINT MULTILINGUAL SENTENCE REPRESENTATIONS PARALLEL CORPUS MINING

Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning

ICLR 2018 facebookresearch/InferSent

A lot of the recent success in natural language processing (NLP) has been driven by distributed vector representations of words trained on large amounts of text in an unsupervised manner. In this work, we present a simple, effective multi-task learning framework for sentence representations that combines the inductive biases of diverse training objectives in a single model.

MULTI-TASK LEARNING NATURAL LANGUAGE INFERENCE PARAPHRASE IDENTIFICATION SEMANTIC TEXTUAL SIMILARITY

SentEval: An Evaluation Toolkit for Universal Sentence Representations

LREC 2018 facebookresearch/InferSent

We introduce SentEval, a toolkit for evaluating the quality of universal sentence representations. SentEval encompasses a variety of tasks, including binary and multi-class classification, natural language inference and sentence similarity.

NATURAL LANGUAGE INFERENCE

Supervised Learning of Universal Sentence Representations from Natural Language Inference Data

EMNLP 2017 facebookresearch/InferSent

Many modern NLP systems rely on word embeddings, previously trained in an unsupervised manner on large corpora, as base features. Efforts to obtain embeddings for larger chunks of text, such as sentences, have however not been so successful.

CROSS-LINGUAL NATURAL LANGUAGE INFERENCE SEMANTIC TEXTUAL SIMILARITY TRANSFER LEARNING WORD EMBEDDINGS

Improving Language Understanding by Generative Pre-Training

Preprint 2018 openai/finetune-transformer-lm

We demonstrate that large gains on these tasks can be realized by generative pre-training of a language model on a diverse corpus of unlabeled text, followed by discriminative fine-tuning on each specific task. We demonstrate the effectiveness of our approach on a wide range of benchmarks for natural language understanding.

DOCUMENT CLASSIFICATION LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SEMANTIC TEXTUAL SIMILARITY

Machine Comprehension Using Match-LSTM and Answer Pointer

29 Aug 2016 baidu/DuReader

Machine comprehension of text is an important problem in natural language processing. We propose two ways of using Pointer Net for our task.

NATURAL LANGUAGE INFERENCE QUESTION ANSWERING READING COMPREHENSION