Search Results for author: Ralf Möller

Found 19 papers, 3 papers with code

Towards Privacy-Preserving Relational Data Synthesis via Probabilistic Relational Models

no code implementations • 6 Sep 2024 • Malte Luttermann, Ralf Möller, Mattis Hartwig

Probabilistic relational models provide a well-established formalism to combine first-order logic and probabilistic models, thereby making it possible to represent relationships between objects in a relational domain.
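As a rough, purely illustrative sketch of the formalism (not the construction studied in the paper), the Python snippet below represents a single parametric factor over logical variables and enumerates its groundings for a toy domain; all predicate and constant names are made up.

```python
from itertools import product

# A parfactor pairs first-order atoms (with logical variables X, Y) with a
# potential that is shared by all groundings (illustrative example).
parfactor = {
    "logvars": {"X": ["alice", "bob"], "Y": ["alice", "bob"]},
    "atoms": ["Smokes(X)", "Friends(X,Y)"],
    # potential over the two Boolean atoms: (smokes, friends) -> weight
    "potential": {(True, True): 3.0, (True, False): 1.0,
                  (False, True): 1.0, (False, False): 2.0},
}

def ground(pf):
    """Enumerate the propositional factors this parfactor stands for."""
    names, domains = zip(*pf["logvars"].items())
    for values in product(*domains):
        atoms = pf["atoms"]
        for name, value in zip(names, values):
            atoms = [a.replace(name, value) for a in atoms]
        yield atoms, pf["potential"]

for atoms, pot in ground(parfactor):
    print(atoms)  # e.g. ['Smokes(alice)', 'Friends(alice,bob)']
```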

Variables are a Curse in Software Vulnerability Prediction

no code implementations • 18 Jun 2024 • Jinghua Groppe, Sven Groppe, Ralf Möller

Deep learning-based approaches for software vulnerability prediction currently rely mainly on the original text of software code as the feature of nodes in the code graph, and thus may learn a representation that is specific to the code text rather than one that captures the 'intrinsic' functionality of the program hidden in that text.

Lifting Factor Graphs with Some Unknown Factors

1 code implementation • 3 Jun 2024 • Malte Luttermann, Ralf Möller, Marcel Gehrke

Lifting exploits symmetries in probabilistic graphical models by using a representative for indistinguishable objects, making it possible to carry out query answering more efficiently while maintaining exact answers.
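The toy computation below (illustrative only, not the paper's algorithm) shows the basic payoff of using a representative: with one shared potential over n indistinguishable ground atoms, a grounded computation enumerates 2**n joint states, while the lifted view computes one representative sum and exponentiates it by the domain size.

```python
import math
from itertools import product

# One shared unary potential for Sick(p), identical for every person p (made up).
phi = {True: 2.0, False: 3.0}

def partition_grounded(phi, n):
    """Enumerate all 2**n joint states of n ground atoms (exponential in n)."""
    return sum(math.prod(phi[s] for s in state)
               for state in product([True, False], repeat=n))

def partition_lifted(phi, n):
    """Treat the n indistinguishable atoms via one representative and exponentiate."""
    return sum(phi.values()) ** n

print(partition_grounded(phi, 10))   # 9765625.0
print(partition_lifted(phi, 10))     # same value, from a single representative
print(partition_lifted(phi, 100))    # grounded enumeration would need 2**100 terms
```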

Lifted Causal Inference in Relational Domains

1 code implementation • 15 Mar 2024 • Malte Luttermann, Mattis Hartwig, Tanya Braun, Ralf Möller, Marcel Gehrke

Lifted inference exploits symmetries in probabilistic graphical models by using a representative for indistinguishable objects, thereby speeding up query answering while maintaining exact answers.

Causal Inference

Colour Passing Revisited: Lifted Model Construction with Commutative Factors

1 code implementation • 20 Sep 2023 • Malte Luttermann, Tanya Braun, Ralf Möller, Marcel Gehrke

Lifted probabilistic inference exploits symmetries in a probabilistic model to allow for tractable probabilistic inference with respect to domain sizes.
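A minimal sketch of the underlying colour-passing idea, i.e. plain colour refinement on a tiny factor graph, is given below; it is not the commutative-factor extension proposed in the paper, and all variable and potential names are illustrative.

```python
# Tiny factor graph (illustrative): three Sick variables all attached to the
# same unary potential "phi", so colour passing should group them together.
variables = ["Sick(alice)", "Sick(bob)", "Sick(eve)"]
factors = [("phi", ("Sick(alice)",)),
           ("phi", ("Sick(bob)",)),
           ("phi", ("Sick(eve)",))]

def colour_passing(variables, factors, rounds=10):
    """Standard colour refinement: nodes that end up with the same colour are
    indistinguishable and can share one representative in a lifted model."""
    var_col = {v: 0 for v in variables}                       # same range, no evidence
    fac_col = {i: pot for i, (pot, _) in enumerate(factors)}  # initial colour = potential
    for _ in range(rounds):
        # Factors absorb the colours of their arguments (argument order matters).
        fac_sig = {i: (fac_col[i], tuple(var_col[a] for a in args))
                   for i, (_, args) in enumerate(factors)}
        # Variables absorb the colours of incident factors plus their position.
        var_sig = {v: (var_col[v], tuple(sorted(
                        (repr(fac_sig[i]), args.index(v))
                        for i, (_, args) in enumerate(factors) if v in args)))
                   for v in variables}
        # Renumber the signatures to fresh colours and check for stabilisation.
        fac_ids = {s: c for c, s in enumerate(sorted(set(fac_sig.values()), key=repr))}
        var_ids = {s: c for c, s in enumerate(sorted(set(var_sig.values()), key=repr))}
        nxt_fac = {i: fac_ids[fac_sig[i]] for i in fac_col}
        nxt_var = {v: var_ids[var_sig[v]] for v in variables}
        if nxt_fac == fac_col and nxt_var == var_col:
            break
        var_col, fac_col = nxt_var, nxt_fac
    return var_col, fac_col

print(colour_passing(variables, factors))
# ({'Sick(alice)': 0, 'Sick(bob)': 0, 'Sick(eve)': 0}, {0: 0, 1: 0, 2: 0})
```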

Derivation of Learning Rules for Coupled Principal Component Analysis in a Lagrange-Newton Framework

no code implementations • 28 Apr 2022 • Ralf Möller

We describe a Lagrange-Newton framework for the derivation of learning rules with desirable convergence properties and apply it to the case of principal component analysis (PCA).
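For orientation, a generic textbook setup (not the paper's specific derivation) already shows how a Lagrange formulation ties PCA to an eigenvalue problem whose stationarity conditions couple the weight vector and the eigenvalue estimate:

```latex
% Generic illustration: PCA as constrained maximisation of projected variance.
\max_{\mathbf{w}} \; \mathbf{w}^{\top} \mathbf{C}\, \mathbf{w}
  \quad \text{s.t.} \quad \mathbf{w}^{\top} \mathbf{w} = 1,
\qquad
L(\mathbf{w}, \lambda) = \mathbf{w}^{\top} \mathbf{C}\, \mathbf{w}
  - \lambda \left( \mathbf{w}^{\top} \mathbf{w} - 1 \right).

% Stationarity couples the weight vector and the eigenvalue estimate:
\nabla_{\mathbf{w}} L = 2\left( \mathbf{C}\mathbf{w} - \lambda \mathbf{w} \right) = \mathbf{0},
\qquad
\partial_{\lambda} L = 1 - \mathbf{w}^{\top} \mathbf{w} = 0 .
```

Newton steps on such stationarity conditions update both quantities jointly, which is the general flavour of coupled learning rules.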

Lifting DecPOMDPs for Nanoscale Systems -- A Work in Progress

no code implementations • 18 Oct 2021 • Tanya Braun, Stefan Fischer, Florian Lau, Ralf Möller

DNA-based nanonetworks have a wide range of promising use cases, especially in the field of medicine.

Improved Convergence Speed of Fully Symmetric Learning Rules for Principal Component Analysis

no code implementations • 18 Jul 2020 • Ralf Möller

Fully symmetric learning rules for principal component analysis can be derived from a novel objective function suggested in our previous work.

Derivation of Symmetric PCA Learning Rules from a Novel Objective Function

no code implementations • 24 May 2020 • Ralf Möller

However, for a subspace with multiple axes, the optimization leads to PSA learning rules which only converge to axes spanning the principal subspace but not to the principal eigenvectors.
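For context, the classical symmetric subspace rule (Oja's PSA rule) exhibits exactly this behaviour: it finds an orthonormal basis of the principal subspace without singling out the individual eigenvectors. The sketch below is that textbook rule, not the learning rules derived in the paper; the data and parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def oja_subspace_rule(X, m=2, gamma=0.005, epochs=50):
    """Classical symmetric subspace (PSA) rule: W += gamma * (x y^T - W y y^T)."""
    W = rng.normal(size=(X.shape[1], m)) * 0.1
    for _ in range(epochs):
        for x in X:
            y = W.T @ x                                      # project onto current axes
            W += gamma * (np.outer(x, y) - W @ np.outer(y, y))
    return W

# Toy data whose first two directions carry most of the variance.
X = rng.normal(size=(500, 5)) @ np.diag([3.0, 2.0, 0.3, 0.2, 0.1])
W = oja_subspace_rule(X)
print(np.round(W.T @ W, 2))   # columns become roughly orthonormal and span the
                              # principal subspace, but need not be eigenvectors
```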

Derivation of Coupled PCA and SVD Learning Rules from a Newton Zero-Finding Framework

no code implementations • 25 Mar 2020 • Ralf Möller

A method to derive coupled learning rules from information criteria by Newton optimization is known.
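As a generic illustration of the zero-finding view (not the particular information criterion used in the paper), one can stack the PCA stationarity conditions into a single vector-valued function and apply Newton's method to it, so that the eigenvector and eigenvalue estimates are updated jointly:

```latex
% Generic Newton zero-finding on PCA stationarity conditions (illustrative).
f(\mathbf{w}, \lambda) =
\begin{pmatrix}
  \mathbf{C}\mathbf{w} - \lambda \mathbf{w} \\
  \tfrac{1}{2}\bigl(1 - \mathbf{w}^{\top} \mathbf{w}\bigr)
\end{pmatrix} = \mathbf{0},
\qquad
\begin{pmatrix} \mathbf{w} \\ \lambda \end{pmatrix}
\leftarrow
\begin{pmatrix} \mathbf{w} \\ \lambda \end{pmatrix}
- J_f(\mathbf{w}, \lambda)^{-1}\, f(\mathbf{w}, \lambda).
```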

Exploring Unknown Universes in Probabilistic Relational Models

no code implementations • 7 Jan 2020 • Tanya Braun, Ralf Möller

Large probabilistic models are often shaped by a pool of known individuals (a universe) and relations between them.

Taming Reasoning in Temporal Probabilistic Relational Models

no code implementations • 16 Nov 2019 • Marcel Gehrke, Ralf Möller, Tanya Braun

Evidence often grounds temporal probabilistic relational models over time, which makes reasoning infeasible.

Clustering

Answering Hindsight Queries with Lifted Dynamic Junction Trees

no code implementations • 2 Jul 2018 • Marcel Gehrke, Tanya Braun, Ralf Möller

The lifted dynamic junction tree algorithm (LDJT) efficiently answers filtering and prediction queries for probabilistic relational temporal models by building and then reusing a first-order cluster representation of a knowledge base for multiple queries and time steps.
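To place the two query types, the propositional analogue below runs a plain hidden Markov model forward pass: filtering asks for the current state given the evidence so far, and prediction pushes that belief further into the future without new evidence. This is only a toy stand-in, not LDJT or a relational model; all numbers are made up.

```python
import numpy as np

# Toy 2-state temporal model (illustrative).
T = np.array([[0.8, 0.2],      # transition: P(state_t | state_{t-1})
              [0.3, 0.7]])
O = np.array([[0.9, 0.1],      # observation: P(obs | state)
              [0.2, 0.8]])
prior = np.array([0.5, 0.5])

def filtering(obs_seq):
    """Belief over the current state given all evidence so far."""
    belief = prior
    for o in obs_seq:
        belief = O[:, o] * (T.T @ belief)   # predict one step, then weight by evidence
        belief /= belief.sum()
    return belief

def prediction(obs_seq, k):
    """Belief k steps ahead of the last observation (no further evidence)."""
    belief = filtering(obs_seq)
    for _ in range(k):
        belief = T.T @ belief
    return belief

print(filtering([0, 0, 1]))
print(prediction([0, 0, 1], k=2))
```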

Fusing First-order Knowledge Compilation and the Lifted Junction Tree Algorithm

no code implementations • 2 Jul 2018 • Tanya Braun, Ralf Möller

Standard approaches for inference in probabilistic formalisms with first-order constructs include lifted variable elimination (LVE) for single queries as well as first-order knowledge compilation (FOKC) based on weighted model counting.
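A brute-force propositional weighted model counting example is sketched below for orientation; FOKC operates at the first-order level by compiling the theory rather than enumerating models, so this shows only the underlying counting task, with made-up weights.

```python
from itertools import product
from math import prod

# Tiny propositional theory (illustrative): the clause (rain -> wet),
# with a weight attached to each literal.
variables = ["rain", "wet"]
weights = {("rain", True): 0.2, ("rain", False): 0.8,
           ("wet", True): 0.7, ("wet", False): 0.3}

def satisfies(a):
    return (not a["rain"]) or a["wet"]      # clause: not rain, or wet

def wmc():
    """Weighted model count: sum over satisfying assignments of the product
    of literal weights (brute force, exponential in the number of variables)."""
    total = 0.0
    for values in product([True, False], repeat=len(variables)):
        a = dict(zip(variables, values))
        if satisfies(a):
            total += prod(weights[(v, val)] for v, val in a.items())
    return total

print(wmc())  # 0.2*0.7 + 0.8*0.7 + 0.8*0.3 = 0.94
```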

Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm

no code implementations • 2 Jul 2018 • Marcel Gehrke, Tanya Braun, Ralf Möller

The lifted dynamic junction tree algorithm (LDJT) efficiently answers filtering and prediction queries for probabilistic relational temporal models by building and then reusing a first-order cluster representation of a knowledge base for multiple queries and time steps.

Towards Analytics Aware Ontology Based Access to Static and Streaming Data (Extended Version)

no code implementations • 18 Jul 2016 • Evgeny Kharlamov, Yannis Kotidis, Theofilos Mailis, Christian Neuenstadt, Charalampos Nikolaou, Özgür Özcep, Christoforos Svingos, Dmitriy Zheleznyakov, Sebastian Brandt, Ian Horrocks, Yannis Ioannidis, Steffen Lamparter, Ralf Möller

Real-time analytics that requires integration and aggregation of heterogeneous and distributed streaming and static data is a typical task in many industrial scenarios such as diagnostics of turbines in Siemens.
