
Common Sense Reasoning

25 papers with code · Reasoning

Common sense reasoning tasks are designed to require more than pattern recognition: the model must draw on "common sense" or world knowledge to make inferences.

State-of-the-art leaderboards

Latest papers with code

Language Models are Unsupervised Multitask Learners

Preprint 2019 openai/gpt-2

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets. We demonstrate that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText.

COMMON SENSE REASONING DOCUMENT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION QUESTION ANSWERING READING COMPREHENSION
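The multitask claim above rests on the standard language-modelling objective: predict the next token, and evaluate by perplexity. As a minimal illustration of that metric, the sketch below uses a Laplace-smoothed unigram model; this is a toy stand-in for illustration, not the paper's Transformer:

```python
import math
from collections import Counter

def unigram_perplexity(train_tokens, test_tokens, alpha=1.0):
    """Perplexity of test_tokens under a Laplace-smoothed unigram LM
    estimated from train_tokens (lower is better)."""
    counts = Counter(train_tokens)
    vocab = set(train_tokens) | set(test_tokens)
    total = sum(counts.values())
    V = len(vocab)
    log_prob = 0.0
    for tok in test_tokens:
        p = (counts[tok] + alpha) / (total + alpha * V)
        log_prob += math.log(p)
    return math.exp(-log_prob / len(test_tokens))

# A token the model has seen often gets higher probability,
# hence lower perplexity, than a rarer one.
train = "the cat sat on the mat".split()
assert unigram_perplexity(train, ["the"]) < unigram_perplexity(train, ["mat"])
```

The same "how well does the model predict held-out text" measurement underlies the paper's zero-shot evaluations, where tasks are framed as text continuation.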

Compositional Language Understanding with Text-based Relational Reasoning

7 Nov 2018 koustuvsinha/clutrr

Neural networks for natural language reasoning have largely focused on extractive, fact-based question-answering (QA) and common-sense inference. However, it is also crucial to understand the extent to which neural networks can perform relational reasoning and combinatorial generalization from natural language: abilities that are often obscured by annotation artifacts and the dominance of language modeling in standard QA benchmarks.

COMMON SENSE REASONING LANGUAGE MODELLING QUESTION ANSWERING RELATIONAL REASONING


pair2vec: Compositional Word-Pair Embeddings for Cross-Sentence Inference

20 Oct 2018 mandarjoshi90/pair2vec

Reasoning about implied relationships (e.g. paraphrastic, common sense, encyclopedic) between pairs of words is crucial for many cross-sentence inference problems. This paper proposes new methods for learning and using embeddings of word pairs that implicitly represent background knowledge about such relationships.

COMMON SENSE REASONING WORD EMBEDDINGS
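As a rough sketch of what a dedicated pair representation looks like, the toy below composes two word vectors through a fixed projection. The vocabulary, dimensions, and composition function here are illustrative assumptions; pair2vec instead learns a multi-layer network end-to-end from the contexts in which word pairs co-occur:

```python
import numpy as np

rng = np.random.default_rng(0)
dim, pair_dim = 50, 100

# Toy word embeddings (random stand-ins; in the paper these are
# learned jointly with the pair representation).
vocab = ["lion", "cat", "animal", "paris", "france"]
emb = {w: rng.standard_normal(dim) for w in vocab}

# Hypothetical composition: concatenate the two vectors and their
# elementwise product, then apply a fixed projection. The order of
# concatenation makes the representation asymmetric, so (x, y) and
# (y, x) get different embeddings.
W = rng.standard_normal((pair_dim, 3 * dim)) / np.sqrt(3 * dim)

def pair_embedding(x, y):
    features = np.concatenate([emb[x], emb[y], emb[x] * emb[y]])
    return np.tanh(W @ features)

v = pair_embedding("lion", "animal")
print(v.shape)  # (100,)
```

The asymmetry matters because many target relations (hypernymy, part-of) are directional.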


BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

11 Oct 2018 rpuiggari/bert2

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations by jointly conditioning on both left and right context in all layers.

COMMON SENSE REASONING CROSS-LINGUAL NATURAL LANGUAGE INFERENCE NAMED ENTITY RECOGNITION QUESTION ANSWERING
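The bidirectional conditioning is made possible by the masked-LM objective: corrupt a fraction of input tokens and train the model to recover them. Below is a minimal sketch of the corruption step described in the paper (roughly 15% of positions are selected; of those, 80% become [MASK], 10% a random token, and 10% stay unchanged), using a toy vocabulary:

```python
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]

def mask_tokens(tokens, rng, mask_prob=0.15):
    """BERT-style corruption. Returns the corrupted token list and a
    dict mapping each selected position to the original token the
    model must predict back."""
    corrupted, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok  # prediction target at this position
            roll = rng.random()
            if roll < 0.8:
                corrupted[i] = MASK
            elif roll < 0.9:
                corrupted[i] = rng.choice(VOCAB)
            # else: keep the original token unchanged
    return corrupted, targets

rng = random.Random(0)
out, tgt = mask_tokens(["the", "cat", "sat", "on", "the", "mat"] * 50, rng)
print(len(tgt) / len(out))  # roughly 0.15
```

Keeping 10% of selected tokens unchanged forces the model to maintain a useful representation of every position, not just the visibly masked ones.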


SWAG: A Large-Scale Adversarial Dataset for Grounded Commonsense Inference

EMNLP 2018 chiahsuan156/Question-Answering_Resources_Papers

Given a partial description like "she opened the hood of the car," humans can reason about the situation and anticipate what might come next ("then, she examined the engine"). In this paper, we introduce the task of grounded commonsense inference, unifying natural language inference and commonsense reasoning.

COMMON SENSE REASONING NATURAL LANGUAGE INFERENCE


Incorporating Chinese Characters of Words for Lexical Sememe Prediction

ACL 2018 thunlp/Character-enhanced-Sememe-Prediction

However, existing methods of lexical sememe prediction typically rely on the external context of words to represent the meaning, which usually fails to deal with low-frequency and out-of-vocabulary words. To address this issue for Chinese, we propose a novel framework to take advantage of both internal character information and external context information of words.

COMMON SENSE REASONING


A Simple Method for Commonsense Reasoning

7 Jun 2018 tensorflow/models

For example, it is difficult to use neural networks to tackle the Winograd Schema dataset (Levesque et al., 2011). In this paper, we present a simple method for commonsense reasoning with neural networks, using unsupervised learning.

COMMON SENSE REASONING
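The method's core recipe is to substitute each candidate antecedent for the ambiguous pronoun and keep the sentence the language model scores higher. The sketch below illustrates that scoring step with a smoothed bigram model over a deliberately tiny (and contrived) corpus; the paper uses large pretrained LMs rather than anything this small:

```python
import math
from collections import Counter

def bigram_logprob(sentence, counts, unigrams, alpha=0.1):
    """Log-probability of a sentence under a smoothed bigram LM."""
    tokens = sentence.lower().split()
    V = len(unigrams)
    lp = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        lp += math.log((counts[(prev, cur)] + alpha) /
                       (unigrams[prev] + alpha * V))
    return lp

# Tiny corpus standing in for the large unlabeled text the paper uses.
corpus = ("the trophy did not fit because the trophy was too big . "
          "the suitcase was too small .").lower().split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

# Winograd-style resolution: substitute each candidate for the
# pronoun in "... because it was too big" and keep the higher-scoring
# resolved sentence.
candidates = ["the trophy was too big", "the suitcase was too big"]
best = max(candidates, key=lambda s: bigram_logprob(s, bigrams, unigrams))
print(best)  # the trophy was too big
```

The point is that no labeled Winograd data is needed: the decision falls out of probabilities estimated from raw text.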

Empirical Analysis of Foundational Distinctions in Linked Open Data

26 Mar 2018 fdistinctions/ijcai18

For example, distinctions such as whether an entity is inherently a class or an individual, or whether it is a physical object or not, are hardly expressed in the data, although they have been largely studied and formalised by foundational ontologies (e.g. DOLCE, SUMO). We want to answer questions such as "does the DBpedia entity for dog refer to a class or to an instance?".

COMMON SENSE REASONING SCENE RECOGNITION


Relational Neural Expectation Maximization: Unsupervised Discovery of Objects and their Interactions

ICLR 2018 sjoerdvansteenkiste/Relational-NEM

Common-sense physical reasoning is an essential ingredient for any intelligent agent operating in the real-world. For example, it can be used to simulate the environment, or to infer the state of parts of the world that are currently unobserved.

COMMON SENSE REASONING


DKN: Deep Knowledge-Aware Network for News Recommendation

25 Jan 2018 hwwang55/DKN

To solve the above problems, in this paper, we propose a deep knowledge-aware network (DKN) that incorporates knowledge-graph representations into news recommendation. We also validate the efficacy of using knowledge in DKN.

CLICK-THROUGH RATE PREDICTION COMMON SENSE REASONING RECOMMENDATION SYSTEMS
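At the core of DKN's user modelling is an attention step that weights a user's previously clicked news by relevance to the candidate news. A minimal numpy sketch of that aggregation, substituting plain dot-product attention for the small attention network (and random vectors for the CNN-derived, entity-enriched news embeddings) that the paper actually trains:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def user_embedding(clicked, candidate):
    """Aggregate clicked-news vectors into a candidate-specific user
    vector: items similar to the candidate get larger attention
    weights, so the user representation adapts to what is being
    scored."""
    scores = clicked @ candidate   # (n_clicked,) similarity scores
    weights = softmax(scores)      # attention weights, sum to 1
    return weights @ clicked       # (dim,) weighted average

rng = np.random.default_rng(0)
clicked = rng.standard_normal((5, 16))   # 5 previously clicked items
candidate = rng.standard_normal(16)      # candidate news embedding
u = user_embedding(clicked, candidate)
score = float(u @ candidate)             # click-probability logit
print(u.shape)  # (16,)
```

Making the user vector depend on the candidate is the design choice that lets the same click history support very different recommendations.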
