Neural Networks Enhancement with Logical Knowledge

In recent years, there has been growing interest in Neural-Symbolic Integration frameworks, i.e., hybrid systems that integrate connectionist and symbolic approaches to obtain the best of both worlds. In previous work, we proposed KENN (Knowledge Enhanced Neural Networks), a Neural-Symbolic architecture that injects prior logical knowledge into a neural network by adding a final layer that modifies the initial predictions according to the knowledge. Among the advantages of this strategy is the inclusion of clause weights, learnable parameters that represent the strength of the clauses, meaning that the model can learn the impact of each clause on the final predictions. As a special case, if the training data contradicts a constraint, KENN learns to ignore it, making the system robust to the presence of wrong knowledge. In this paper, we propose an extension of KENN for relational data. To evaluate this extension, we tested it with different learning configurations on Citeseer, a standard dataset for Collective Classification. The results show that KENN is capable of improving the performance of the underlying neural network even in the presence of relational data, outperforming two other notable methods that combine learning with logic.
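As a rough illustration of the idea (a simplified sketch, not the authors' implementation), the enhancement layer can be seen as adding a correction to the base network's preactivations for each clause: the correction pushes a disjunctive clause toward satisfaction by boosting the literal that is already closest to true, scaled by a learnable, non-negative clause weight `w`. With `w = 0` the clause has no effect, which is how contradicted knowledge can be ignored. The function names and the softmax-based boost below are assumptions for illustration:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def enhance(z, clause_idx, clause_signs, w):
    """Sketch of a KENN-style clause enhancement (simplified assumption).

    z            -- preactivations (logits) of the base network, one per ground atom
    clause_idx   -- indices of the atoms appearing in one disjunctive clause
    clause_signs -- +1 for a positive literal, -1 for a negated one
    w            -- learnable clause weight (w >= 0); w == 0 ignores the clause

    Returns a new list of preactivations in which the clause's literals are
    boosted toward satisfaction, with most of the boost going to the literal
    that is already closest to true (softmax-weighted).
    """
    # Literal values: negated atoms contribute with flipped sign.
    lit = [s * z[i] for i, s in zip(clause_idx, clause_signs)]
    # Distribute the total boost w across the literals.
    deltas = [w * p for p in softmax(lit)]
    z_new = list(z)
    for i, s, d in zip(clause_idx, clause_signs, deltas):
        # Map the boost back onto the atom, respecting the literal's sign.
        z_new[i] += s * d
    return z_new
```

For example, for the clause A ∨ B over atoms with logits `[0.2, -1.0]`, `enhance([0.2, -1.0, 0.5], [0, 1], [1, 1], w)` raises both logits when `w > 0` (with the larger share going to A) and leaves all logits unchanged when `w = 0`; atoms outside the clause are never modified. In the actual architecture these corrections are produced for every clause and the clause weights are trained jointly with the base network.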
