no code implementations • 28 Nov 2024 • Michael Cummins, Guner Dilsad Er, Michael Muehlebach
We address the problem of client participation in federated learning, where traditional methods typically rely on a random selection of a small subset of clients for each training round.
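The random-selection baseline this abstract refers to can be sketched in a few lines; this is a generic illustration of per-round client sampling in federated learning, not the method proposed in the paper:

```python
import random

def select_clients(all_clients, subset_size, rng):
    """Baseline client participation: sample a small random subset of
    clients (without replacement) for one federated training round."""
    return rng.sample(all_clients, subset_size)

# One simulated round over 100 clients, selecting 10 participants.
rng = random.Random(0)
clients = list(range(100))
round_participants = select_clients(clients, 10, rng)
```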
no code implementations • 2 Oct 2024 • Klaus-Rudolf Kladny, Bernhard Schölkopf, Michael Muehlebach
In this work, we propose Sequential Conformal Prediction for Generative Models (SCOPE-Gen), a sequential conformal prediction method producing prediction sets that satisfy a rigorous statistical guarantee called conformal admissibility control.
1 code implementation • 18 Jul 2024 • Anna M. Wundram, Paul Fischer, Michael Muehlebach, Lisa M. Koch, Christian F. Baumgartner
Our results show that it is possible to achieve the desired coverage with small prediction ranges, highlighting the potential of performance range prediction as a valuable tool for output quality control.
1 code implementation • 11 Jul 2024 • Paul Fischer, Hannah Willms, Moritz Schneider, Daniela Thorwarth, Michael Muehlebach, Christian F. Baumgartner
We evaluate our algorithm on real clinical planning volumes from five different anatomical regions and show that our novel subgroup RCPS (SG-RCPS) algorithm leads to prediction intervals that jointly control the risk for multiple subgroups.
no code implementations • 28 May 2024 • Onno Eberhard, Claire Vernade, Michael Muehlebach
Reinforcement learning has traditionally focused on learning state-dependent policies to solve optimal control problems in a closed-loop fashion.
no code implementations • 17 May 2024 • Guner Dilsad Er, Sebastian Trimpe, Michael Muehlebach
We also characterize the effect of communication drops and demonstrate that our algorithm is robust to communication failures.
no code implementations • 8 Apr 2024 • Hao Ma, Melanie Zeilinger, Michael Muehlebach
We propose a novel gradient-based online optimization framework for solving stochastic programming problems that frequently arise in the context of cyber-physical and robotic systems.
no code implementations • 19 Mar 2024 • Liang Zhang, Niao He, Michael Muehlebach
In this work, we propose a simple primal method, termed Constrained Gradient Method (CGM), for addressing functional constrained variational inequality problems, without necessitating any information on the optimal Lagrange multipliers.
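For context, the classical projection-based baseline for a variational inequality (find $x^*$ in $X$ with $F(x^*)^\top(x - x^*) \ge 0$ for all $x \in X$) iterates a projected operator step. The sketch below shows that baseline on a toy monotone operator; it is not the CGM update from the paper, which specifically avoids needing optimal Lagrange multiplier information for functional constraints:

```python
import numpy as np

def projected_vi_step(x, F, project, eta=0.1):
    """One projection-type step for a variational inequality:
    move against the operator F, then project back onto the
    feasible set X (here given by `project`)."""
    return project(x - eta * F(x))

# Toy monotone operator F(x) = x - b with box constraint X = [0, 1]^2.
b = np.array([0.5, 2.0])
F = lambda x: x - b
project = lambda x: np.clip(x, 0.0, 1.0)

x = np.zeros(2)
for _ in range(200):
    x = projected_vi_step(x, F, project)
# x approaches the projection of b onto the box, i.e. [0.5, 1.0]
```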
no code implementations • 8 Feb 2024 • Jasan Zughaibi, Bradley J. Nelson, Michael Muehlebach
This greatly expands the range of potential medical applications and includes even dynamic environments as encountered in cardiovascular interventions.
no code implementations • 25 Jan 2024 • Florian Dörfler, Zhiyu He, Giuseppe Belgioioso, Saverio Bolognani, John Lygeros, Michael Muehlebach
Traditionally, numerical algorithms are seen as isolated pieces of code confined to an in silico existence.
1 code implementation • 11 Oct 2023 • Klaus-Rudolf Kladny, Julius von Kügelgen, Bernhard Schölkopf, Michael Muehlebach
Counterfactuals answer questions of what would have been observed under altered circumstances and can therefore offer valuable insights.
1 code implementation • 9 Jun 2023 • Klaus-Rudolf Kladny, Julius von Kügelgen, Bernhard Schölkopf, Michael Muehlebach
We study causal effect estimation from a mixture of observational and interventional data in a confounded linear regression model with multivariate treatments.
no code implementations • 24 May 2023 • Jan Achterhold, Philip Tobuschat, Hao Ma, Dieter Buechler, Michael Muehlebach, Joerg Stueckler
Our gray-box approach builds on a physical model.
no code implementations • 6 Apr 2023 • Michael Muehlebach
We consider adaptive decision-making problems where an agent optimizes a cumulative performance objective by repeatedly choosing among a finite set of options.
no code implementations • 16 Mar 2023 • Sholom Schechtman, Daniil Tiapkin, Michael Muehlebach, Eric Moulines
We consider the problem of minimizing a non-convex function over a smooth manifold $\mathcal{M}$.
no code implementations • 1 Feb 2023 • Michael Muehlebach, Michael I. Jordan
We exploit analogies between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms for constrained optimization.
1 code implementation • 12 Dec 2022 • Daniel Frank, Decky Aspandi Latif, Michael Muehlebach, Benjamin Unger, Steffen Staab
In this work, we represent a recurrent neural network as a linear time-invariant system with nonlinear disturbances.
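A minimal sketch of this viewpoint, under the assumption of a vanilla tanh recurrence (the paper's exact decomposition and system matrices may differ): one recurrent step $h^+ = \tanh(Ah + Bu)$ is rewritten as a linear time-invariant update plus a nonlinear residual treated as a disturbance.

```python
import numpy as np

def rnn_as_lti_step(h, u, A, B, phi=np.tanh):
    """View h+ = phi(A h + B u) as an LTI update plus a disturbance:
    h+ = (A h + B u) + d,  where d = phi(A h + B u) - (A h + B u)."""
    z = A @ h + B @ u
    d = phi(z) - z          # nonlinear residual treated as disturbance
    return z + d, d

rng = np.random.default_rng(0)
A = 0.5 * np.eye(3)
B = 0.1 * rng.standard_normal((3, 2))
h = np.zeros(3)
u = np.ones(2)
h_next, d = rnn_as_lti_step(h, u, A, B)
# By construction, LTI part plus disturbance reproduces tanh(A h + B u).
```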
no code implementations • 7 Jun 2022 • Aniket Das, Bernhard Schölkopf, Michael Muehlebach
We obtain tight convergence rates for RR and SO and demonstrate that these strategies lead to faster convergence than uniform sampling.
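The three sampling strategies compared here — random reshuffling (RR), shuffle-once (SO), and uniform i.i.d. sampling — differ only in how the index order for each epoch is generated. A minimal sketch (function names are illustrative, not from the paper):

```python
import random

def epoch_order(n, strategy, rng):
    """Index order for one epoch of n samples under three strategies:
    'rr'  (random reshuffling): fresh permutation every epoch,
    'so'  (shuffle-once): one fixed permutation reused every epoch,
    'iid' (uniform sampling): n independent uniform draws (duplicates allowed)."""
    if strategy == "rr":
        order = list(range(n))
        rng.shuffle(order)
        return order
    if strategy == "so":
        so_rng = random.Random(0)   # fixed seed -> identical order each epoch
        order = list(range(n))
        so_rng.shuffle(order)
        return order
    return [rng.randrange(n) for _ in range(n)]
```

Both RR and SO visit every sample exactly once per epoch, which is the structural property behind their faster convergence versus uniform sampling.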
no code implementations • 17 Jul 2021 • Michael Muehlebach, Michael I. Jordan
We introduce a class of first-order methods for smooth constrained optimization that are based on an analogy to non-smooth dynamical systems.
no code implementations • 28 Feb 2020 • Michael Muehlebach, Michael I. Jordan
We analyze the convergence rate of various momentum-based optimization algorithms from a dynamical systems point of view.
no code implementations • ICML 2020 • Michael Muehlebach, Michael I. Jordan
This article derives lower bounds on the convergence rate of continuous-time gradient-based optimization algorithms.
Optimization and Control • Systems and Control
no code implementations • 26 May 2019 • N. Benjamin Erichson, Michael Muehlebach, Michael W. Mahoney
In addition to providing high-profile successes in computer vision and natural language processing, neural networks also provide an emerging set of techniques for scientific problems.
no code implementations • 17 May 2019 • Michael Muehlebach, Michael I. Jordan
We present a dynamical system framework for understanding Nesterov's accelerated gradient method.