no code implementations • 16 May 2024 • Lucas Böttcher, Ronald Klingebiel
Budgetary constraints force organizations to pursue only a subset of possible innovation projects.
no code implementations • 5 Apr 2024 • Lucas Böttcher, Gregory Wheeler
Artificial neural networks can be seen as high-dimensional mathematical functions, and understanding the geometric properties of their loss landscapes (i.e., the high-dimensional space on which one wishes to find extrema or saddles) can provide valuable insights into their optimization behavior, generalization abilities, and overall performance.
1 code implementation • 18 Mar 2024 • Lucas Böttcher, Luis L. Fonseca, Reinhard C. Laubenbacher
A key technology for this purpose involves medical digital twins, computational models of human biology that can be personalized and dynamically updated to incorporate patient-specific data collected over time.
1 code implementation • 8 Feb 2024 • Luis L. Fonseca, Lucas Böttcher, Borna Mehrad, Reinhard C. Laubenbacher
The vision of personalized medicine is to identify interventions that maintain or restore a person's health based on their individual biology.
1 code implementation • 15 Jul 2023 • Lucas Böttcher
Ensemble Kalman inversion (EKI) is a sequential Monte Carlo method used to solve inverse problems within a Bayesian framework.
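As a rough illustration of the EKI idea (not the paper's implementation), the following sketch recovers the parameters of a hypothetical linear forward model by repeatedly applying a Kalman-type update, with perturbed observations, to an ensemble of particles; the model matrix, noise level, and ensemble size are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model G(u) = A @ u (illustrative only)
A = np.array([[1.0, 2.0], [3.0, 1.0], [0.5, 1.5]])
u_true = np.array([1.0, -0.5])
gamma = 0.01  # assumed observation noise variance
y = A @ u_true + rng.normal(0, np.sqrt(gamma), size=3)

# Ensemble of J particles drawn from a broad prior
J = 200
u = rng.normal(0.0, 2.0, size=(J, 2))
Gamma = gamma * np.eye(3)

for _ in range(30):
    g = u @ A.T                                # forward evaluations, shape (J, 3)
    u_mean, g_mean = u.mean(0), g.mean(0)
    Cup = (u - u_mean).T @ (g - g_mean) / J    # parameter-output cross-covariance
    Cpp = (g - g_mean).T @ (g - g_mean) / J    # output covariance
    # Kalman-type update with perturbed observations
    y_pert = y + rng.normal(0, np.sqrt(gamma), size=(J, 3))
    u = u + (y_pert - g) @ np.linalg.solve(Cpp + Gamma, Cup.T)

u_hat = u.mean(0)  # ensemble mean approximates the reconstructed parameters
```

Because the forward model here is linear and well-conditioned, the ensemble mean contracts toward the data-misfit minimizer after a few dozen iterations.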
1 code implementation • 28 Aug 2022 • Lucas Böttcher, Gregory Wheeler
We show that saddle points in the original space are rarely correctly identified as such in expected lower-dimensional representations if random projections are used.
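A minimal numerical sketch of why this happens, under assumed (not the paper's) Hessian spectra: a high-dimensional quadratic saddle with a few weak negative directions almost always looks like a minimum when restricted to a random low-dimensional subspace, because the restricted Hessian is dominated by the many positive eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(1)

# Quadratic f(x) = 0.5 x^T H x with a saddle at the origin:
# 95 positive and 5 weakly negative Hessian eigenvalues (illustrative choice)
n = 100
H = np.diag(np.concatenate([np.ones(95), -0.1 * np.ones(5)]))

def fraction_misread_as_minimum(k=2, trials=1000):
    """Fraction of random k-dim projections whose restricted Hessian
    P^T H P is positive definite, i.e. the saddle is misread as a minimum."""
    count = 0
    for _ in range(trials):
        P, _ = np.linalg.qr(rng.normal(size=(n, k)))  # random orthonormal basis
        if np.all(np.linalg.eigvalsh(P.T @ H @ P) > 0):
            count += 1
    return count / trials

frac = fraction_misread_as_minimum()
```

For a random unit direction v, the curvature v^T H v concentrates near the mean eigenvalue (0.945 here), so nearly every random projection hides the negative directions.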
1 code implementation • 22 Jun 2022 • Lucas Böttcher, Thomas Asikis
Optimal control problems naturally arise in many scientific applications where one wishes to steer a dynamical system from a certain initial state $\mathbf{x}_0$ to a desired target state $\mathbf{x}^*$ in finite time $T$.
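A small self-contained example of this steering problem, using a discrete-time stand-in rather than the continuous-time formulation above: for a hypothetical linear system $x_{k+1} = A x_k + B u_k$, the minimum-norm control sequence reaching a target state in $N$ steps can be read off from the reachability matrix via least squares. All system matrices and the horizon are illustrative assumptions.

```python
import numpy as np

# Hypothetical discrete-time linear system x_{k+1} = A x_k + B u_k
A = np.array([[1.0, 0.1], [0.0, 1.0]])  # discretized double integrator
B = np.array([[0.0], [0.1]])
x0 = np.array([0.0, 0.0])
x_target = np.array([1.0, 0.0])
N = 50  # control horizon

# Build the reachability matrix [A^(N-1) B, ..., A B, B]
cols, P = [], np.eye(2)
for _ in range(N):
    cols.append(P @ B)
    P = A @ P          # after the loop, P = A^N
C = np.hstack(cols[::-1])
rhs = x_target - P @ x0

# Minimum-norm control sequence solving C u = x_target - A^N x0
u, *_ = np.linalg.lstsq(C, rhs, rcond=None)

# Simulate forward to verify the endpoint
x = x0.copy()
for k in range(N):
    x = A @ x + B[:, 0] * u[k]
```

Since the pair (A, B) is controllable, the least-squares solution reaches the target exactly, and among all such control sequences it has minimal energy.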
1 code implementation • 18 May 2022 • Lucas Böttcher, Sascha Wald, Tom Chou
Complementing the results on simulated repertoires, we derive explicit expressions for the richness and its uncertainty for specific, single-parameter truncated power-law probability distributions.
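The expected richness admits a simple closed form for sampling with replacement: a clone with probability $p_i$ goes unseen in $N$ draws with probability $(1-p_i)^N$, so $\mathbb{E}[R] = \sum_i \bigl(1-(1-p_i)^N\bigr)$. The sketch below checks this formula against direct simulation for a truncated power law; the exponent, cutoff, and sample size are illustrative, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Truncated power-law clone-size distribution p_i ∝ i^(-alpha), i = 1..M
M, alpha, N = 1000, 2.0, 5000
p = np.arange(1, M + 1, dtype=float) ** (-alpha)
p /= p.sum()

# Expected richness: E[R] = sum_i 1 - (1 - p_i)^N
richness_exact = np.sum(1.0 - (1.0 - p) ** N)

# Monte Carlo check by direct sampling of clone labels
samples = rng.choice(M, size=(200, N), p=p)
richness_mc = np.mean([len(np.unique(s)) for s in samples])
```

The Monte Carlo estimate is unbiased for the exact expression, so the two agree up to sampling fluctuations.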
1 code implementation • 6 Feb 2022 • Mingtao Xia, Lucas Böttcher, Tom Chou
We propose a solution to such problems by combining two classes of numerical methods: (i) adaptive spectral methods and (ii) physics-informed neural networks (PINNs).
1 code implementation • 16 Jan 2022 • Lucas Böttcher, Thomas Asikis, Ioannis Fragkos
To solve such optimization problems, inventory managers need to decide what quantities to order from each supplier, given the net inventory and outstanding orders, so that the expected backlogging, holding, and sourcing costs are jointly minimized.
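As a toy stand-in for this trade-off (a single-period, single-supplier newsvendor sketch, not the paper's dual-sourcing model), the order quantity balancing expected holding against backlog costs can be found by direct search; the cost rates and demand distribution are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

h, b = 1.0, 4.0                        # hypothetical unit holding / backlog costs
demand = rng.poisson(20, size=100_000)  # hypothetical demand distribution

def expected_cost(q):
    """Monte Carlo estimate of expected holding + backlogging cost for order q."""
    inv = q - demand                    # net inventory after demand is realized
    return np.mean(h * np.maximum(inv, 0) + b * np.maximum(-inv, 0))

q_best = min(range(41), key=expected_cost)
```

The minimizer sits near the critical fractile b / (b + h) = 0.8 of the demand distribution, i.e. around the 80th percentile of Poisson(20).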
1 code implementation • 29 Jul 2021 • Mingtao Xia, Lucas Böttcher, Tom Chou
Efficient testing and vaccination protocols are critical aspects of epidemic management.
no code implementations • 11 Mar 2021 • Lucas Böttcher, Nino Antulov-Fantulin, Thomas Asikis
Although optimal control problems of dynamical systems can be formulated within the framework of variational calculus, their solution for complex systems is often analytically and computationally intractable.
1 code implementation • 10 Jan 2021 • Lucas Böttcher, Maria D'Orsogna, Tom Chou
We find that the average excess death across the entire US is 13% higher than the number of reported COVID-19 deaths.
1 code implementation • 2 Aug 2020 • Sascha Wald, Lucas Böttcher
We study the influence of quantum effects on the stationary and long-time average probability distribution by interpolating between the classical and quantum regime.
Statistical Mechanics • Quantum Physics
1 code implementation • 17 Jun 2020 • Thomas Asikis, Lucas Böttcher, Nino Antulov-Fantulin
We study the ability of neural networks to calculate feedback control signals that steer trajectories of continuous-time nonlinear dynamical systems on graphs, which we represent with neural ordinary differential equations (neural ODEs).
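A drastically simplified sketch of this training loop, assuming a scalar linear system and a one-parameter linear feedback "network" in place of a neural ODE: the controlled dynamics are integrated forward with Euler steps, and the feedback gain is tuned by finite-difference gradient descent on the terminal cost. Everything here (system, controller, hyperparameters) is an illustrative stand-in, not the authors' architecture.

```python
# Toy sketch: linear state feedback u = -k x steering dx/dt = x + u toward 0,
# with k tuned by finite-difference gradient descent on the terminal cost
# |x(T)|^2 (an illustrative stand-in for gradient-based neural-ODE training).
def terminal_cost(k, x0=1.0, T=2.0, dt=0.01):
    x = x0
    for _ in range(int(T / dt)):  # forward Euler integration of the closed loop
        x += dt * (x - k * x)     # dx/dt = x + u, with u = -k x
    return x ** 2

k = 0.0
lr, eps = 1e-2, 1e-4
for _ in range(200):
    # central finite-difference estimate of d(cost)/dk
    grad = (terminal_cost(k + eps) - terminal_cost(k - eps)) / (2 * eps)
    k -= lr * grad
```

The uncontrolled system (k = 0) is unstable; gradient descent pushes k past 1, which stabilizes the closed loop and drives the terminal cost toward zero.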
1 code implementation • 15 Jan 2020 • Francesco D'Angelo, Lucas Böttcher
We also find that convolutional layers in VAEs are important to model spin correlations whereas RBMs achieve similar or even better performances without convolutional filters.