Maintaining prediction quality under the condition of a growing knowledge space

3 Aug 2015 · Christoph Jahnz

Intelligence can be understood as an agent's ability to predict its environment's dynamics with a level of precision that allows it to effectively foresee opportunities and threats. Under the assumption that such intelligence relies on a knowledge space, any effective reasoning would benefit from a maximum portion of useful and a minimum portion of misleading knowledge fragments. This raises the question of how the quality of such a knowledge space can be kept high as the amount of knowledge keeps growing. This article proposes a mathematical model describing general principles of how the quality of a growing knowledge space evolves depending on error rate, error propagation and countermeasures. It is also shown to what extent the quality of a knowledge space collapses when the removal of low-quality knowledge fragments occurs too slowly relative to the knowledge space's growth rate.
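
To make the abstract's quantities concrete, here is a minimal discrete-time simulation sketch under assumptions of my own, not taken from the paper: the knowledge space grows by a fixed fraction per step, a new fragment is flawed either through a fresh error (error rate) or by building on an already flawed fragment (error propagation), and a countermeasure purges a fixed fraction of the flawed fragments each step. The function and parameter names (`quality_trajectory`, `growth`, `error_rate`, `propagation`, `removal`) are illustrative placeholders, not symbols from the paper.

```python
def quality_trajectory(steps, growth, error_rate, propagation, removal, f0=0.0):
    """Evolve the flawed fraction f of a growing knowledge space.

    Per step (all quantities relative to the current space size):
      * growth      -- fraction of new fragments added,
      * a new fragment is flawed if it inherits a flaw from an existing
        flawed fragment (probability propagation * f) or through a fresh
        error (probability error_rate),
      * removal     -- fraction of the flawed fragments purged by the
        countermeasure.
    Returns the quality (share of useful fragments, 1 - f) after each step.
    """
    f, history = f0, []
    for _ in range(steps):
        flawed_new = propagation * f + (1 - propagation * f) * error_rate
        flawed_mass = f * (1 - removal) + growth * flawed_new
        total_mass = 1 + growth - removal * f
        f = flawed_mass / total_mass
        history.append(1 - f)
    return history


if __name__ == "__main__":
    # Two removal rates for otherwise identical parameters. With full
    # propagation this toy model has a sharp threshold at
    # removal = growth * error_rate: above it the quality stabilizes,
    # below it the flawed fraction drifts toward 1 and quality collapses.
    for removal in (0.10, 0.002):
        q = quality_trajectory(steps=500, growth=0.20, error_rate=0.05,
                               propagation=1.0, removal=removal)
        print(f"removal={removal:.3f}: quality at step 100 = {q[99]:.2f}, "
              f"step 500 = {q[-1]:.2f}")
```

Working with fractions rather than absolute fragment counts keeps the comparison between removal rate and growth rate visible independently of the space's absolute size, which is the trade-off the abstract describes.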
