no code implementations • 18 Jul 2016 • Evgeny Kharlamov, Yannis Kotidis, Theofilos Mailis, Christian Neuenstadt, Charalampos Nikolaou, Özgür Özcep, Christoforos Svingos, Dmitriy Zheleznyakov, Sebastian Brandt, Ian Horrocks, Yannis Ioannidis, Steffen Lamparter, Ralf Möller
Real-time analytics that requires the integration and aggregation of heterogeneous, distributed streaming and static data is a typical task in many industrial scenarios, such as turbine diagnostics at Siemens.
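The Siemens setting itself is not detailed in this snippet; as a minimal, hypothetical sketch of the task, the following joins a sliding window of streaming sensor readings with a static metadata table and aggregates per turbine (all names and values invented for illustration):

```python
from collections import defaultdict, deque

# Static data: sensor-to-turbine metadata (hypothetical example).
SENSOR_TO_TURBINE = {"s1": "turbine-A", "s2": "turbine-A", "s3": "turbine-B"}

WINDOW = 3  # sliding-window length per sensor


def make_windows():
    """One bounded sliding window of recent readings per sensor."""
    return defaultdict(lambda: deque(maxlen=WINDOW))


def ingest(windows, sensor_id, value):
    """Append a streaming reading to the sensor's sliding window."""
    windows[sensor_id].append(value)


def avg_per_turbine(windows):
    """Join the streaming windows with the static metadata and
    aggregate: average reading per turbine over current windows."""
    sums, counts = defaultdict(float), defaultdict(int)
    for sensor_id, readings in windows.items():
        turbine = SENSOR_TO_TURBINE[sensor_id]
        sums[turbine] += sum(readings)
        counts[turbine] += len(readings)
    return {t: sums[t] / counts[t] for t in sums}


windows = make_windows()
for sid, v in [("s1", 10.0), ("s2", 20.0), ("s3", 30.0), ("s1", 12.0)]:
    ingest(windows, sid, v)
print(avg_per_turbine(windows))  # {'turbine-A': 14.0, 'turbine-B': 30.0}
```

Real deployments would replace the in-memory dictionaries with a stream processor and an ontology-mediated query layer, but the join-then-aggregate shape of the task is the same.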
no code implementations • 1 Nov 2016 • Wolfram Schenck, Hendrik Hasenbein, Ralf Möller
This model is based on the internal simulation of movement sequences.
no code implementations • 2 Jul 2018 • Marcel Gehrke, Tanya Braun, Ralf Möller
The lifted dynamic junction tree algorithm (LDJT) efficiently answers filtering and prediction queries for probabilistic relational temporal models by building and then reusing a first-order cluster representation of a knowledge base for multiple queries and time steps.
no code implementations • 2 Jul 2018 • Tanya Braun, Ralf Möller
Standard approaches for inference in probabilistic formalisms with first-order constructs include lifted variable elimination (LVE) for single queries as well as first-order knowledge compilation (FOKC) based on weighted model counting.
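FOKC reduces inference to weighted model counting (WMC). As background (not the paper's compiled algorithm), a brute-force WMC sums, over the satisfying assignments of a formula, the product of per-literal weights; the two-variable formula below is a hypothetical example:

```python
from itertools import product


def wmc(variables, formula, weight_true, weight_false):
    """Weighted model count by brute-force enumeration: sum over all
    satisfying assignments of the product of per-literal weights.
    Exponential in the number of variables; knowledge compilation
    exists precisely to avoid this enumeration."""
    total = 0.0
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            w = 1.0
            for var, val in assignment.items():
                w *= weight_true[var] if val else weight_false[var]
            total += w
    return total


# Hypothetical example: formula (a or b) with literal weights below.
wt = {"a": 0.3, "b": 0.6}
wf = {"a": 0.7, "b": 0.4}
print(wmc(["a", "b"], lambda m: m["a"] or m["b"], wt, wf))
# 0.3*0.6 + 0.3*0.4 + 0.7*0.6 = 0.72
```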
no code implementations • 2 Jul 2018 • Marcel Gehrke, Tanya Braun, Ralf Möller
The lifted dynamic junction tree algorithm (LDJT) efficiently answers filtering and prediction queries for probabilistic relational temporal models by building and then reusing a first-order cluster representation of a knowledge base for multiple queries and time steps.
no code implementations • 16 Nov 2019 • Marcel Gehrke, Ralf Möller, Tanya Braun
Evidence often grounds temporal probabilistic relational models over time, which makes reasoning infeasible.
no code implementations • 7 Jan 2020 • Tanya Braun, Ralf Möller
Large probabilistic models are often shaped by a pool of known individuals (a universe) and relations between them.
no code implementations • 25 Mar 2020 • Ralf Möller
A method is known for deriving coupled learning rules from information criteria by Newton optimization.
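The paper's coupled rules are not reproduced in this snippet; as background, the Newton iteration underlying such derivations is the classic update x ← x − f(x)/f′(x), sketched here on a hypothetical scalar example:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Basic scalar Newton iteration x <- x - f(x)/f'(x).
    Illustrative only; the paper applies Newton optimization to
    information criteria to derive coupled learning rules."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x


# Hypothetical example: find sqrt(2) as the positive root of x^2 - 2.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
```

Newton's quadratic local convergence is what makes such derived rules attractive compared with plain gradient-based updates.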
no code implementations • 24 May 2020 • Ralf Möller
However, for a subspace with multiple axes, the optimization leads to principal subspace analysis (PSA) learning rules that converge only to axes spanning the principal subspace, not to the principal eigenvectors themselves.
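The PSA-versus-PCA distinction can be seen in a small sketch: any rotation of the principal eigenvectors within their subspace yields different axes but exactly the same projection matrix, i.e. the same subspace (hypothetical 3-D example, not from the paper):

```python
import math


def projection(vectors):
    """Projection matrix onto span(vectors), computed as the sum of
    outer products of the orthonormal column vectors."""
    n = len(vectors[0])
    P = [[0.0] * n for _ in range(n)]
    for v in vectors:
        for i in range(n):
            for j in range(n):
                P[i][j] += v[i] * v[j]
    return P


# Two orthonormal axes in R^3, standing in for principal eigenvectors.
e1, e2 = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)

# A rotation within the subspace produces different axes ...
t = math.radians(30)
r1 = tuple(math.cos(t) * a + math.sin(t) * b for a, b in zip(e1, e2))
r2 = tuple(-math.sin(t) * a + math.cos(t) * b for a, b in zip(e1, e2))

# ... but exactly the same projection matrix, hence the same subspace.
P_eig = projection([e1, e2])
P_rot = projection([r1, r2])
```

A PSA rule cannot distinguish the rotated axes from the eigenvectors, which is why extra structure is needed to recover the eigenvectors individually.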
no code implementations • 18 Jul 2020 • Ralf Möller
Fully symmetric learning rules for principal component analysis can be derived from a novel objective function suggested in our previous work.
no code implementations • 18 Oct 2021 • Tanya Braun, Stefan Fischer, Florian Lau, Ralf Möller
DNA-based nanonetworks have a wide range of promising use cases, especially in the field of medicine.
no code implementations • 28 Apr 2022 • Ralf Möller
We describe a Lagrange-Newton framework for the derivation of learning rules with desirable convergence properties and apply it to the case of principal component analysis (PCA).
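The Lagrange-Newton rules themselves are not reproduced in this snippet; as background on the kind of online PCA learning rule being derived, here is the classic Oja rule on a toy 2-D data set (data and step size hypothetical):

```python
def oja_step(w, x, lr):
    """One step of Oja's classic PCA learning rule:
    w <- w + lr * y * (x - y * w), with y = w . x.
    Background only; the paper derives its rules via a
    Lagrange-Newton framework instead."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    return [wi + lr * y * (xi - y * wi) for wi, xi in zip(w, x)]


# Toy 2-D data whose variance is concentrated on the first axis,
# so the leading eigenvector is (1, 0) up to sign.
data = [(1.0, 0.2), (-1.0, 0.2), (1.0, -0.2), (-1.0, -0.2)]

w = [0.6, 0.8]  # unit-norm start, deliberately off-axis
for _ in range(1000):
    for x in data:
        w = oja_step(w, x, lr=0.05)
# w approaches a unit vector along the leading eigenvector.
```

Gradient rules like Oja's converge slowly and can be hard to analyze; deriving updates by Newton steps on a constrained objective is one route to better convergence properties.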
1 code implementation • 20 Sep 2023 • Malte Luttermann, Tanya Braun, Ralf Möller, Marcel Gehrke
Lifted probabilistic inference exploits symmetries in a probabilistic model to allow for tractable probabilistic inference with respect to domain sizes.
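A minimal, hypothetical illustration of the symmetry argument: if n individuals are indistinguishable, a product of n identical ground factors collapses to one representative factor raised to the domain size, so the cost no longer grows with n-many multiplications over ground instances:

```python
def grounded_none_sick(n, p):
    """Grounded computation: one factor per individual, multiplied
    over every ground instance (illustrative toy model)."""
    result = 1.0
    for _ in range(n):
        result *= (1.0 - p)
    return result


def lifted_none_sick(n, p):
    """Lifted computation: all n individuals are indistinguishable,
    so a single representative factor is raised to the domain size."""
    return (1.0 - p) ** n


# Hypothetical model: each of n people is independently sick with
# probability p; query the probability that no one is sick.
n, p = 1000, 0.01
assert abs(grounded_none_sick(n, p) - lifted_none_sick(n, p)) < 1e-12
```

Real lifted inference generalizes this counting idea to shared factors inside variable elimination and junction trees, but the exponent-versus-loop contrast is the core of the speedup.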
1 code implementation • 15 Mar 2024 • Malte Luttermann, Mattis Hartwig, Tanya Braun, Ralf Möller, Marcel Gehrke
Lifted inference exploits symmetries in probabilistic graphical models by using a representative for indistinguishable objects, thereby speeding up query answering while maintaining exact answers.