Search Results for author: Johannes Müller

Found 20 papers, 5 papers with code

Fisher-Rao Gradient Flows of Linear Programs and State-Action Natural Policy Gradients

no code implementations • 28 Mar 2024 • Johannes Müller, Semih Çaycı, Guido Montúfar

Kakade's natural policy gradient method has been studied extensively in recent years, showing linear convergence both with and without regularization.

Achieving High Accuracy with PINNs via Energy Natural Gradients

1 code implementation • 25 Feb 2023 • Johannes Müller, Marius Zeinhofer

We propose energy natural gradient descent, a natural gradient method with respect to a Hessian-induced Riemannian metric, as an optimization algorithm for physics-informed neural networks (PINNs) and the deep Ritz method.
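
To make the idea concrete, here is a minimal generic natural-gradient update: the parameter update is preconditioned by a metric matrix G (for energy natural gradients, a Hessian-induced Gram matrix). The function name, damping term, and step size below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def natural_gradient_step(theta, grad, G, lr=1e-2, damping=1e-8):
    """One preconditioned update theta <- theta - lr * G^{-1} grad,
    where G is the (e.g. Hessian-induced) metric in parameter space."""
    # A damped solve avoids inverting a possibly ill-conditioned metric.
    direction = np.linalg.solve(G + damping * np.eye(len(theta)), grad)
    return theta - lr * direction
```

With G equal to the identity, this reduces to plain gradient descent; the metric only reshapes the descent direction, not the loss being minimized.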


Age structure, replicator equation, and the prisoner's dilemma

no code implementations • 19 Dec 2022 • Sona John, Johannes Müller

These time scales, which seem to form a universal structure in the interplay of weak selection and life-history traits, allow us to reduce the infinite dimensional model to a one-dimensional modified replicator equation.
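
For context, the classical (unmodified) replicator equation the paper generalizes is ẋᵢ = xᵢ((Ax)ᵢ − xᵀAx). A sketch with an assumed prisoner's-dilemma payoff matrix (the specific values are illustrative, not from the paper):

```python
import numpy as np

# Standard replicator dynamics x_i' = x_i * ((A x)_i - x^T A x) under an
# assumed prisoner's-dilemma payoff matrix; rows index the focal strategy.
A = np.array([[3.0, 0.0],   # cooperate vs (cooperate, defect)
              [5.0, 1.0]])  # defect    vs (cooperate, defect)

def replicator_step(x, dt=0.01):
    fitness = A @ x
    mean_fitness = x @ fitness
    return x + dt * x * (fitness - mean_fitness)

x = np.array([0.5, 0.5])          # start with equal frequencies
for _ in range(2000):
    x = replicator_step(x)
# Defection dominates, as expected in the prisoner's dilemma.
```

Note that the dynamics preserve the simplex (frequencies stay non-negative and sum to one), which is the structure the one-dimensional reduction in the paper also lives on.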

Geometry and convergence of natural policy gradient methods

no code implementations • 3 Nov 2022 • Johannes Müller, Guido Montúfar

We study the convergence of several natural policy gradient (NPG) methods in infinite-horizon discounted Markov decision processes with regular policy parametrizations.

Tasks: Policy Gradient Methods

Extrinsic Camera Calibration with Semantic Segmentation

1 code implementation • 8 Aug 2022 • Alexander Tsaregorodtsev, Johannes Müller, Jan Strohbeck, Martin Herrmann, Michael Buchholz, Vasileios Belagiannis

Our approach relies on a coarse initial measurement of the camera pose and builds on lidar sensors mounted on a vehicle with high-precision localization to capture a point cloud of the camera environment.

Tasks: Camera Calibration, Segmentation +1

Invariance Properties of the Natural Gradient in Overparametrised Systems

no code implementations • 30 Jun 2022 • Jesse van Oostrum, Johannes Müller, Nihat Ay

The natural gradient field is a vector field that lives on a model equipped with a distinguished Riemannian metric, e.g. the Fisher-Rao metric, and represents the direction of steepest ascent of an objective function on the model with respect to this metric.

Self-Assessment for Single-Object Tracking in Clutter Using Subjective Logic

no code implementations • 15 Jun 2022 • Thomas Griebel, Johannes Müller, Paul Geisler, Charlotte Hermann, Martin Herrmann, Michael Buchholz, Klaus Dietmayer

Therefore, this work presents a novel method for self-assessment of single-object tracking in clutter based on Kalman filtering and subjective logic.

Tasks: Decision Making, Object Tracking

Solving infinite-horizon POMDPs with memoryless stochastic policies in state-action space

1 code implementation • 27 May 2022 • Johannes Müller, Guido Montúfar

Reward optimization in fully observable Markov decision processes is equivalent to a linear program over the polytope of state-action frequencies.
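
The linear-program formulation mentioned here can be made concrete on a toy MDP: maximize r·μ over state-action frequencies μ(s, a) subject to the discounted flow constraints Σₐ μ(s, a) − γ Σ_{s',a'} P(s | s', a') μ(s', a') = (1 − γ) ρ(s) and μ ≥ 0. The two-state MDP below is an assumed example, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy 2-state, 2-action discounted MDP (values assumed for illustration).
n_states, n_actions, gamma = 2, 2, 0.9
P = np.zeros((n_states, n_states, n_actions))  # P[s_next, s, a]
P[:, 0, 0] = [1.0, 0.0]   # in state 0, action 0 stays in state 0
P[:, 0, 1] = [0.0, 1.0]   # in state 0, action 1 moves to state 1
P[:, 1, 0] = [1.0, 0.0]   # in state 1, action 0 returns to state 0
P[:, 1, 1] = [0.0, 1.0]   # in state 1, action 1 stays in state 1
r = np.array([[0.0, 0.0],
              [1.0, 1.0]])          # reward 1 whenever acting in state 1
rho = np.array([1.0, 0.0])          # start in state 0

# Flow constraints over mu(s, a), flattened row-major to match r.flatten().
A_eq = np.zeros((n_states, n_states * n_actions))
for s in range(n_states):
    for s2 in range(n_states):
        for a in range(n_actions):
            A_eq[s, s2 * n_actions + a] += (s == s2) - gamma * P[s, s2, a]
b_eq = (1 - gamma) * rho

res = linprog(c=-r.flatten(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
mu = res.x.reshape(n_states, n_actions)
```

The optimal μ concentrates on the policy "move to state 1, then stay", and the LP value equals (1 − γ) times the optimal discounted return.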

Deep Reinforcement Learning for Data-Driven Adaptive Scanning in Ptychography

no code implementations • 29 Mar 2022 • Marcel Schloz, Johannes Müller, Thomas C. Pekin, Wouter Van den Broek, Christoph T. Koch

We present a method that lowers the dose required for a ptychographic reconstruction by adaptively scanning the specimen, thereby providing the required spatial information redundancy in the regions of highest importance.

Tasks: Reinforcement Learning (RL)

Life-History traits and the replicator equation

no code implementations • 13 Nov 2021 • Johannes Müller, Aurélien Tellier

In this context, it is fundamentally of interest to generalize the replicator equation, which is at the heart of most population genomics models.

The Geometry of Memoryless Stochastic Policy Optimization in Infinite-Horizon POMDPs

2 code implementations • ICLR 2022 • Johannes Müller, Guido Montúfar

We then describe the optimization problem as a linear optimization problem in the space of feasible state-action frequencies subject to polynomial constraints that we characterize explicitly.

Error Estimates for the Deep Ritz Method with Boundary Penalty

no code implementations • 1 Mar 2021 • Johannes Müller, Marius Zeinhofer

Our results apply to arbitrary sets of ansatz functions and estimate the error in dependence of the optimization accuracy, the approximation capabilities of the ansatz class and -- in the case of Dirichlet boundary values -- the penalization strength $\lambda$.

Contact Tracing & Super-Spreaders in the Branching-Process Model

no code implementations • 10 Oct 2020 • Johannes Müller, Volker Hösel

We investigate a novel model for super-spreader events, not based on a heterogeneous contact graph but on a random contact rate: Many individuals become infected synchronously in single contact events.
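
The random-contact-rate idea can be illustrated with a generic mixed-Poisson offspring sketch: each infected individual draws its own contact rate, which produces occasional very large "super-spreader" offspring counts. The distributions and parameters below are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Mixed-Poisson offspring: heterogeneous per-individual contact rates with
# the same mean R0 as a homogeneous baseline, but a much heavier tail.
rng = np.random.default_rng(0)
R0, n = 2.0, 100_000
rates = rng.gamma(shape=0.5, scale=R0 / 0.5, size=n)  # random contact rates
offspring_mixed = rng.poisson(rates)                  # super-spreader-prone
offspring_fixed = rng.poisson(R0, size=n)             # homogeneous baseline
```

Both samples have mean ≈ R0, but the mixed model's variance is several times larger, which is exactly what makes single contact events able to infect many individuals at once.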

Kalman Filter Meets Subjective Logic: A Self-Assessing Kalman Filter Using Subjective Logic

no code implementations • 1 Jul 2020 • Thomas Griebel, Johannes Müller, Michael Buchholz, Klaus Dietmayer

Thus, by embedding classical Kalman filtering into subjective logic, our method additionally features an explicit measure for statistical uncertainty in the self-assessment.

Deep Ritz revisited

no code implementations • ICLR Workshop DeepDiffEq 2019 • Johannes Müller, Marius Zeinhofer

In these notes, we use the notion of $\Gamma$-convergence to show that ReLU networks of growing architecture that are trained with respect to suitably regularised Dirichlet energies converge to the true solution of the Poisson problem.

LACI: Low-effort Automatic Calibration of Infrastructure Sensors

no code implementations • 5 Nov 2019 • Johannes Müller, Martin Herrmann, Jan Strohbeck, Vasileios Belagiannis, Michael Buchholz

While classical approaches are sensor-specific and often need calibration targets as well as a widely overlapping field of view (FOV), in this work a cooperative intelligent vehicle is used as a calibration target.

On the space-time expressivity of ResNets

no code implementations • ICLR Workshop DeepDiffEq 2019 • Johannes Müller

This structure can be seen as the Euler discretisation of an associated ordinary differential equation (ODE), which is called a neural ODE.
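
The correspondence is that a residual block x ↦ x + h·f(x) is one explicit-Euler step of the ODE x′(t) = f(x(t)). A minimal sketch, with a toy tanh layer standing in for f (the layer form and step size are assumptions):

```python
import numpy as np

# A stack of residual blocks x_{k+1} = x_k + h * f(x_k) is the explicit-Euler
# discretisation of x'(t) = f(x(t)); here f is a toy tanh layer per step.
def residual_forward(x, weights, h=0.1):
    for W in weights:            # one residual block per Euler step
        x = x + h * np.tanh(W @ x)
    return x
```

With zero weights each block is the identity, so the input passes through unchanged; shrinking h while stacking more blocks approaches the continuous-time flow.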

Multi-scale Convolutional Neural Networks for Inverse Problems

2 code implementations • 29 Oct 2018 • Feng Wang, Alberto Eljarrat, Johannes Müller, Trond Henninen, Erni Rolf, Christoph Koch

We propose a novel neural network architecture, featuring fast convergence, as a generic solution to image(s)-to-image(s) inverse problems from different domains.

Tasks: Computational Physics, Materials Science

Speeding up SOR Solvers for Constraint-based GUIs with a Warm-Start Strategy

no code implementations • 6 Jan 2014 • Noreen Jamil, Johannes Müller, Christof Lutteroth, Gerald Weber

Constraints are a powerful tool for specifying adaptable GUI layouts: they are used to specify a layout in a general form, and a constraint solver is used to find a satisfying concrete layout, e.g. for a specific GUI size.
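
A basic successive over-relaxation (SOR) iteration for a linear system Ax = b looks as follows; warm-starting means passing the previous layout's solution as x0 instead of zeros, so far fewer sweeps are needed after a small resize. The system and relaxation factor below are assumed toy values, not the paper's solver.

```python
import numpy as np

# SOR sweeps for A x = b; x0 is the starting guess (the warm start).
def sor(A, b, x0, omega=1.5, tol=1e-10, max_iter=10_000):
    x = x0.astype(float).copy()
    n = len(b)
    for _ in range(max_iter):
        for i in range(n):
            # Sum of the i-th row excluding the diagonal entry.
            sigma = A[i] @ x - A[i, i] * x[i]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(A @ x - b) < tol:
            break
    return x
```

For constraint-based layouts, successive solves differ only slightly (e.g. one changed window dimension), which is why reusing the previous x as x0 pays off.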
