# Logical Reasoning

131 papers with code • 9 benchmarks • 11 datasets

## Libraries

Use these libraries to find Logical Reasoning models and implementations.

## Datasets

## Subtasks

- Navigate
- Novel Concepts
- Temporal Sequences
- Physical Intuition
- StrategyQA
- Elementary Mathematics
- Date Understanding
- Logic Grid Puzzle
- Epistemic Reasoning
- Logical Fallacy Detection
- Logical Sequence
- Analytic Entailment
- Code Line Descriptions
- Checkmate In One
- Entailed Polarity
- Evaluating Information Essentiality
- Logical Args
- Metaphor Boolean
- Penguins In A Table
- Presuppositions As NLI
- Reasoning About Colored Objects
- College Mathematics

## Most implemented papers

### Beta Embeddings for Multi-Hop Logical Reasoning in Knowledge Graphs

Logical operations are performed in the embedding space by neural operators over the probabilistic embeddings.
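As a rough illustration of what "neural operators over probabilistic embeddings" means in the BetaE approach, here is a minimal plain-Python sketch (not the paper's implementation): each embedding dimension is a Beta(α, β) distribution, negation maps parameters to their reciprocals, and intersection averages parameters with attention-style weights. The function names and the toy two-dimensional embeddings are illustrative only.

```python
# Hedged sketch of BetaE-style logical operators on probabilistic embeddings.
# An embedding is a list of (alpha, beta) Beta-distribution parameters,
# one pair per dimension.

def negate(dist):
    # BetaE negation: Beta(a, b) -> Beta(1/a, 1/b)
    return [(1.0 / a, 1.0 / b) for (a, b) in dist]

def intersect(dists, weights):
    # Intersection: weighted average of parameters across input embeddings
    # (in the paper the weights come from a learned attention network).
    total = sum(weights)
    out = []
    for dims in zip(*dists):
        a = sum(w * d[0] for w, d in zip(weights, dims)) / total
        b = sum(w * d[1] for w, d in zip(weights, dims)) / total
        out.append((a, b))
    return out

p = [(2.0, 5.0), (1.0, 3.0)]   # toy 2-dimensional embedding
q = [(4.0, 1.0), (2.0, 2.0)]
print(negate(p))                       # [(0.5, 0.2), (1.0, ...)]
print(intersect([p, q], [0.5, 0.5]))   # [(3.0, 3.0), (1.5, 2.5)]
```

Because every operator is a smooth function of the parameters, gradients flow through arbitrary compositions of negation and intersection, which is what lets multi-hop queries be trained end to end.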

### PaLM: Scaling Language Modeling with Pathways

To further our understanding of the impact of scale on few-shot learning, we trained a 540-billion-parameter, densely activated Transformer language model, which we call the Pathways Language Model (PaLM).

### SATNet: Bridging deep learning and logical reasoning using a differentiable satisfiability solver

We demonstrate that by integrating this solver into end-to-end learning systems, we can learn the logical structure of challenging problems in a minimally supervised fashion.

### Neural Collaborative Reasoning

Existing Collaborative Filtering (CF) methods are mostly designed around matching: by learning user and item embeddings from data with shallow or deep models, they capture associative relevance patterns so that a user embedding can be matched against relevant item embeddings using designed or learned similarity functions.

### Neural Logic Reasoning

Both reasoning and generalization ability are important for prediction tasks such as recommender systems, where reasoning provides strong connection between user history and target items for accurate prediction, and generalization helps the model to draw a robust user portrait over noisy inputs.

### MetaLogic: Logical Reasoning Explanations with Fine-Grained Structure

To this end, we propose a comprehensive logical reasoning explanation form.

### Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge

We propose Logic Tensor Networks: a uniform framework for integrating automatic learning and reasoning.
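Frameworks in this family make logic differentiable by mapping truth values into [0, 1] and replacing the Boolean connectives with fuzzy-logic operators. The sketch below shows one standard choice (the product t-norm and its dual); it is a generic illustration of the idea, not the Logic Tensor Networks API, and the function names are made up for the example.

```python
# Hedged sketch of differentiable fuzzy-logic connectives (product t-norm
# family). Truth values are floats in [0, 1], so every connective is a
# smooth function and gradients can flow through logical formulas.

def f_and(a, b):
    # product t-norm: fuzzy conjunction
    return a * b

def f_or(a, b):
    # probabilistic sum (dual t-conorm): fuzzy disjunction
    return a + b - a * b

def f_not(a):
    # standard fuzzy negation
    return 1.0 - a

def f_implies(a, b):
    # material implication derived as not(a) or b
    return f_or(f_not(a), b)

# At the Boolean corners the operators agree with classical logic:
print(f_and(1.0, 0.0))      # 0.0
print(f_implies(0.0, 0.0))  # 1.0
# In between, truth degrees are graded:
print(f_and(0.9, 0.8))      # roughly 0.72
```

In a full system such as Logic Tensor Networks, the atomic truth values come from neural predicates over tensor embeddings, and training maximizes the fuzzy truth of a knowledge base of formulas.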

### Chains of Reasoning over Entities, Relations, and Text using Recurrent Neural Networks

Our goal is to combine the rich multi-step inference of symbolic logical reasoning with the generalization capabilities of neural networks.

### Ontology Reasoning with Deep Neural Networks

This is an important and at the same time very natural logical reasoning task, which is why the presented approach is applicable to a plethora of important real-world problems.

### MMM: Multi-stage Multi-task Learning for Multi-choice Reading Comprehension

Machine Reading Comprehension (MRC) for question answering (QA), which aims to answer a question given the relevant context passages, is an important way to test the ability of intelligent systems to understand human language.