Three types of independencies are important to represent and exploit for scalable inference in hybrid models: conditional independencies elegantly modeled in Bayesian networks, context-specific independencies naturally represented by logical rules, and independencies amongst attributes of related objects in relational models succinctly expressed by combining rules.
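As an illustration of the last kind (our own example, not taken from the abstract), noisy-or is a widely used combining rule: if related objects contribute parents $x_1, \dots, x_n$ to an attribute $y$, their influences are aggregated as
$$P(y = 1 \mid x_1, \dots, x_n) = 1 - \prod_{i=1}^{n} (1 - p_i)^{x_i},$$
where $p_i$ is the probability that the $i$-th parent alone activates $y$.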
We demonstrate a deep learning framework that is inherently based on the highly expressive language of relational logic, enabling it, among other things, to capture arbitrarily complex graph structures.
The computation graphs themselves then reflect the symmetries of the underlying data, similarly to lifted graphical models.
We demonstrate a declarative differentiable programming framework based on the language of Lifted Relational Neural Networks, where small parameterized logic programs are used to encode relational learning scenarios.
It is known, due to the work of Van den Broeck et al. [KR 2014], that weighted first-order model counting (WFOMC) in the two-variable fragment of first-order logic can be solved in time polynomial in the number of domain elements.
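For context, the symmetric WFOMC task can be stated as follows (the standard definition, paraphrased here rather than quoted from the paper): given a first-order sentence $\phi$, a domain of size $n$, and per-predicate weight functions $w$ and $\bar{w}$, compute
$$\mathrm{WFOMC}(\phi, n, w, \bar{w}) = \sum_{\omega \models \phi} \; \prod_{\ell \in \omega_T} w(\mathrm{pred}(\ell)) \; \prod_{\ell \in \omega_F} \bar{w}(\mathrm{pred}(\ell)),$$
where the sum runs over all models $\omega$ of $\phi$ on the given domain and $\omega_T$, $\omega_F$ denote the true and false ground literals of $\omega$. Domain-liftability means that this sum can be evaluated in time polynomial in $n$.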
In this paper we show that inference in 2-variable Markov logic networks (MLNs) with cardinality and function constraints is domain-liftable.
We study expressivity of Markov logic networks (MLNs).
We study the symmetric weighted first-order model counting task and present ApproxWFOMC, a novel anytime method for efficiently bounding the weighted first-order model count in the presence of an unweighted first-order model counting oracle.
We study computational aspects of relational marginal polytopes, which are the statistical relational learning counterparts of the marginal polytopes well known from probabilistic graphical models.
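Roughly (a paraphrase under our reading, not a quotation from the paper): given formulas $\alpha_1, \dots, \alpha_m$ and a domain of size $n$, the relational marginal polytope is the set of vectors $(\mathbb{E}[Q_{\alpha_1}(\omega)], \dots, \mathbb{E}[Q_{\alpha_m}(\omega)])$ realizable by some distribution over possible worlds $\omega$ on that domain, where $Q_{\alpha}(\omega)$ is the fraction of groundings of $\alpha$ that are true in $\omega$.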
We study theoretical properties of embedding methods for knowledge graph completion under the missing completely at random assumption.
In this paper we present SafeLearner -- a scalable solution to probabilistic knowledge base (KB) completion that performs probabilistic rule learning using lifted probabilistic inference -- as a faster alternative to grounding-based approaches.
Markov Logic Networks (MLNs) are well suited for expressing statistics such as "with high probability a smoker knows another smoker" but not for expressing statements such as "there is a smoker who knows most other smokers", which is necessary for modeling, e.g., influencers in social networks.
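For reference, an MLN with weighted formulas $(w_i, \alpha_i)$ defines the log-linear distribution
$$P(\omega) = \frac{1}{Z} \exp\Big(\sum_i w_i \, n_i(\omega)\Big),$$
where $n_i(\omega)$ counts the true groundings of $\alpha_i$ in the possible world $\omega$ and $Z$ is the normalizing constant. Because the sufficient statistics are grounding counts, soft constraints such as the smoker example above are easy to encode, whereas statements about a single distinguished individual (a smoker who knows most other smokers) do not correspond to any such count.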
In many applications of relational learning, the available data can be seen as a sample from a larger relational structure (e.g., we may be given a small fragment of some social network).
Lifted Relational Neural Networks (LRNNs) describe relational domains using weighted first-order rules which act as templates for constructing feed-forward neural networks.
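To make the template idea concrete, here is a minimal hand-rolled sketch (a simplification with hypothetical weights and activation/aggregation choices, not the actual LRNN implementation) of how one weighted rule, roughly w : smokes(X) <- friend(X, Y), smokes(Y), could be unfolded into a small computation graph over a toy dataset:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy relational data: friendship facts and observed smoking values.
friends = {("alice", "bob"), ("bob", "carol"), ("carol", "alice")}
smokes_observed = {"alice": 0.0, "bob": 1.0, "carol": 0.0}

# One weighted rule template (hypothetical weights):
#   w_rule : smokes(X) <- friend(X, Y), smokes(Y)
w_rule = 1.5     # weight attached to the rule body
w_offset = -1.0  # offset (bias) of the head atom

def smokes_score(x):
    """Unfold the rule template into a tiny computation graph for the atom smokes(x)."""
    # One "ground rule" node per grounding of the body with head variable X = x.
    ground_rule_values = [
        sigmoid(w_rule * smokes_observed[y])  # value of the body friend(x, y) & smokes(y)
        for (head, y) in friends
        if head == x
    ]
    if not ground_rule_values:
        return sigmoid(w_offset)
    # Aggregation node pooling all groundings of the same rule (here: average).
    aggregated = sum(ground_rule_values) / len(ground_rule_values)
    # Atom node combining the aggregated rule contribution with the offset.
    return sigmoid(w_offset + aggregated)

for person in ("alice", "bob", "carol"):
    print(person, round(smokes_score(person), 3))
```

In the actual framework the weights are learned and the activation and aggregation functions are configurable; the sketch only illustrates how the ground neurons mirror the groundings of the rule.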
Compared to Markov Logic Networks (MLNs), our method is faster and produces considerably more interpretable models.