DL2: Training and Querying Neural Networks with Logic

ICLR 2019 · Marc Fischer, Mislav Balunovic, Dana Drachsler-Cohen, Timon Gehr, Ce Zhang, Martin Vechev

We present DL2, a system for training and querying neural networks with logical constraints. The key idea is to translate these constraints into a differentiable loss with desirable mathematical properties and to then either train with this loss in an iterative manner or to use the loss for querying the network for inputs subject to the constraints.
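To make the key idea concrete, here is a minimal sketch (our illustration, not the authors' code) of how comparison constraints can be turned into a non-negative loss that is zero exactly when the constraint holds, in the style of the translation DL2 describes: comparisons become clipped differences, conjunctions become sums, and disjunctions become products.

```python
def loss_le(t1, t2):
    """Loss for the constraint t1 <= t2: zero iff satisfied."""
    return max(t1 - t2, 0.0)

def loss_and(l1, l2):
    """Loss for a conjunction of constraints: sum of the losses."""
    return l1 + l2

def loss_or(l1, l2):
    """Loss for a disjunction of constraints: product of the losses."""
    return l1 * l2

# Example: the box constraint 0 <= y <= 1 on a network output y.
def box_constraint_loss(y):
    return loss_and(loss_le(0.0, y), loss_le(y, 1.0))
```

Minimizing such a loss by gradient descent, either over the network weights (training) or over the input (querying), drives the network toward satisfying the constraint; in practice the same construction is applied to tensor-valued outputs with differentiable primitives rather than Python `max`.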




