Knowledge Base Q&A (KBQA) is the task of answering natural-language questions using facts stored in a knowledge base.
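As a minimal illustration of the task, the sketch below answers questions over a knowledge base represented as (subject, relation, object) triples. The entities, relations, and question templates are toy assumptions, not any particular system's method.

```python
# Toy KB: (subject, relation) -> object triples (illustrative data).
KB = {
    ("Paris", "capital_of"): "France",
    ("France", "currency"): "Euro",
}

# Hypothetical question templates mapping surface forms to KB relations.
PATTERNS = {
    "What country is {e} the capital of?": "capital_of",
    "What currency does {e} use?": "currency",
}

def answer(question: str):
    """Match the question to a template, extract the entity, look up the KB."""
    for template, relation in PATTERNS.items():
        prefix, _, suffix = template.partition("{e}")
        if question.startswith(prefix) and question.endswith(suffix):
            entity = question[len(prefix):len(question) - len(suffix)]
            return KB.get((entity, relation))
    return None

print(answer("What country is Paris the capital of?"))  # France
```

Real KBQA systems replace the template matching with learned semantic parsing or embedding-based retrieval, but the lookup structure is the same.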
Commonsense reasoning aims to empower machines with the human ability to make presumptions about ordinary situations in our daily life.
When answering natural language questions over knowledge bases (KBs), different question components and KB aspects play different roles.
We investigate entity linking in the context of a question answering task and present a jointly optimized neural architecture for entity mention detection and entity disambiguation that models the surrounding context on different levels of granularity.
This approach achieves state-of-the-art results for entity linking on WebQSP-WD.
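To make the two stages concrete, here is a deliberately simple entity-linking sketch: mentions are detected via an alias table, and each mention is disambiguated by word overlap between the question context and candidate entity descriptions. The alias table, descriptions, and scoring rule are toy assumptions, not the jointly optimized neural architecture described above.

```python
# Illustrative alias table: surface form -> candidate entities (assumed data).
ALIASES = {
    "jaguar": ["Jaguar_(animal)", "Jaguar_Cars"],
    "paris": ["Paris_(France)", "Paris_(Texas)"],
}

# Bag-of-words descriptions for each candidate entity (assumed data).
DESCRIPTIONS = {
    "Jaguar_(animal)": {"cat", "wild", "animal", "americas"},
    "Jaguar_Cars": {"car", "british", "manufacturer", "vehicle"},
    "Paris_(France)": {"france", "capital", "city"},
    "Paris_(Texas)": {"texas", "city", "usa"},
}

def link(question: str) -> dict:
    """Detect mentions via the alias table, then disambiguate by context overlap."""
    tokens = question.lower().strip("?").split()
    context = set(tokens)
    links = {}
    for token in tokens:
        candidates = ALIASES.get(token)
        if not candidates:
            continue  # token is not a known mention
        # Pick the candidate whose description shares the most words with the question.
        best = max(candidates, key=lambda e: len(DESCRIPTIONS[e] & context))
        links[token] = best
    return links

print(link("Which city in France is called Paris?"))  # {'paris': 'Paris_(France)'}
```

Neural linkers replace both the alias lookup and the overlap score with learned representations of mentions and entities, but the detect-then-disambiguate decomposition is the same.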
However, one critical problem is that current approaches achieve high accuracy only for questions whose relations have been seen in the training data.
In this paper, we propose a novel knowledge-aware dialogue generation model (TransDG), which transfers question-representation and knowledge-matching abilities from the knowledge base question answering (KBQA) task to facilitate utterance understanding and factual knowledge selection for dialogue generation.
Second, these two tasks can benefit each other: answer selection can incorporate external knowledge from a knowledge base (KB), while KBQA can be improved by learning contextual information from answer selection.