1 code implementation • 26 Jun 2023 • Samy Badreddine, Luciano Serafini, Michael Spranger
A significant trend in the literature integrates axioms and facts into loss functions by grounding logical symbols with neural networks and logical operators with fuzzy semantics.
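The grounding idea above can be sketched minimally: logical connectives become differentiable fuzzy operators over truth values in [0, 1], so an axiom's degree of violation can serve as a loss term. This is an illustrative sketch, not the paper's implementation; the predicate truth values here are hard-coded stand-ins for neural-network outputs.

```python
import numpy as np

# Fuzzy semantics for connectives over truth degrees in [0, 1].
def t_and(a, b):      # product t-norm: conjunction
    return a * b

def t_not(a):         # standard negation
    return 1.0 - a

def t_implies(a, b):  # Reichenbach implication: a -> b
    return 1.0 - a + a * b

# Truth degrees of P(x) and Q(x) for three individuals x; in practice
# these would come from neural predicates (assumed values here).
p = np.array([0.9, 0.8, 0.2])
q = np.array([0.95, 0.7, 0.1])

# Degree of satisfaction of the axiom "forall x: P(x) -> Q(x)",
# with the universal quantifier aggregated by the mean.
sat = t_implies(p, q).mean()
loss = 1.0 - sat  # the axiom's violation contributes to the training loss
```

Because every operator is differentiable, gradients flow from `loss` back into the networks that produce `p` and `q`.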
no code implementations • 21 Apr 2023 • Donghee Choi, Mogan Gim, Samy Badreddine, Hajung Kim, Donghyeon Park, Jaewoo Kang
We introduce KitchenScale, a fine-tuned Pre-trained Language Model (PLM) that predicts a target ingredient's quantity and measurement unit given its recipe context.
1 code implementation • 31 Mar 2023 • Samy Badreddine, Gianluca Apriceno, Andrea Passerini, Luciano Serafini
In this paper, we introduce Interval Real Logic (IRL), a two-sorted logic that interprets knowledge such as sequential properties (traces) and event properties using sequences of real-featured data.
1 code implementation • 25 Dec 2020 • Samy Badreddine, Artur d'Avila Garcez, Luciano Serafini, Michael Spranger
In this paper, we present Logic Tensor Networks (LTN), a neurosymbolic formalism and computational model that supports learning and reasoning through the introduction of a many-valued, end-to-end differentiable first-order logic called Real Logic as a representation language for deep learning.
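The "end-to-end differentiable" aspect can be illustrated with a toy sketch: a predicate is grounded by a small parametric model producing truth values in (0, 1), and learning maximizes the satisfaction of the knowledge base. All names and data below are hypothetical, and the gradient is taken by finite differences rather than autodiff, purely to keep the sketch self-contained; this is not the LTN library API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predicate(w, x):
    # P(x) grounded by a one-parameter model with output in (0, 1).
    return sigmoid(w * x)

def kb_satisfaction(w, xs):
    # Knowledge base: "forall x: P(x)", quantifier aggregated by the mean.
    return predicate(w, xs).mean()

xs = np.array([0.5, 1.0, 2.0])  # assumed positive examples for P
w = 0.0                          # at w = 0, satisfaction is exactly 0.5

for _ in range(100):
    # Gradient ascent on satisfaction via central finite differences.
    eps = 1e-5
    grad = (kb_satisfaction(w + eps, xs) - kb_satisfaction(w - eps, xs)) / (2 * eps)
    w += 0.5 * grad
```

Since all examples are positive, satisfaction increases as `w` grows, mimicking how LTN trains groundings by maximizing the truth of the formulas.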
no code implementations • 15 Jun 2019 • Samy Badreddine, Michael Spranger
Facts are provided a priori as background knowledge, before learning a policy for how to act in the world.