Continuous and Interactive Factual Knowledge Learning in Verification Dialogues

NeurIPS Workshop HAMLETS 2020 · Anonymous

Knowledge bases (KBs) used in applications such as dialogue systems need to be continuously expanded in order to serve users well. This process is known as knowledge base completion (KBC). A piece of knowledge, or fact, is often represented as a triple (s, r, t), meaning that the entities s and t are linked by the relation r. KBC builds a model to infer missing facts from the existing ones in a given KB. Existing KBC research typically makes the closed-world assumption: to infer a new fact (s, r, t), it requires that s, r, and t already exist in the KB but are not yet linked. Clearly, this assumption is a serious limitation. In this paper, we eliminate this assumption and allow s, r, and/or t to be unknown to the KB, a setting we call open-world knowledge base completion (OKBC). We focus on solving OKBC via user interactions, which enables the proposed system to potentially serve as an engine for learning new knowledge during dialogue. Experimental results show the effectiveness of the proposed approach.
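The distinction between classic KBC and OKBC can be sketched in a few lines. This is an illustrative toy example, not the paper's method: the KB contents and helper names are made up, and the KB is represented simply as a set of triples.

```python
# Toy KB as a set of (s, r, t) triples; contents are illustrative.
kb = {
    ("Obama", "born_in", "Honolulu"),
    ("Honolulu", "located_in", "Hawaii"),
}

# The vocabulary the KB already knows about.
entities = {s for s, _, _ in kb} | {t for _, _, t in kb}
relations = {r for _, r, _ in kb}

def closed_world_candidate(s, r, t):
    """Classic KBC: s, r, and t must all be known to the KB;
    only the link between them is missing."""
    return (s in entities and r in relations and t in entities
            and (s, r, t) not in kb)

def open_world_candidate(s, r, t):
    """OKBC: any of s, r, t may be unknown to the KB;
    the fact only needs to be absent."""
    return (s, r, t) not in kb

# Reuses known entities and a known relation: valid under both settings.
print(closed_world_candidate("Obama", "born_in", "Hawaii"))    # True
# Involves an unknown relation and entity: only OKBC admits it.
print(closed_world_candidate("Obama", "president_of", "USA"))  # False
print(open_world_candidate("Obama", "president_of", "USA"))    # True
```

The last candidate is exactly the case the paper targets: a fact whose components are not yet in the KB, which the proposed system would acquire through interaction with the user rather than reject outright.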
