no code implementations • 14 Jul 2020 • Walid Krichene, Kenneth F. Caluya, Abhishek Halder
Recent results have shown that, for two-layer fully connected neural networks, gradient flow converges to a global optimum in the infinite-width limit, by connecting the mean-field dynamics to the Wasserstein gradient flow.
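A minimal, hypothetical sketch of that picture (the sizes, step size, and target here are illustrative, not from the paper): each hidden neuron of a wide two-layer network is a "particle", and plain gradient descent transports the particle ensemble; in the infinite-width limit this ensemble follows the mean-field dynamics.

```python
import numpy as np

# Illustrative sketch (not the paper's setup): gradient descent on a wide
# two-layer tanh network, viewing each hidden neuron as a particle.
rng = np.random.default_rng(0)
n, width = 50, 200                        # samples, hidden width
X = rng.normal(size=(n, 2))
y = np.sin(X[:, 0])                       # toy regression target

W = rng.normal(size=(width, 2))           # particle positions (input weights)
a = rng.normal(size=width) / width        # mean-field 1/width output scaling

def predict(W, a, X):
    # f(x) = sum_i a_i tanh(w_i . x), with the 1/width factor folded into a
    return np.tanh(X @ W.T) @ a

lr, mse_hist = 0.01, []
for _ in range(2000):
    r = predict(W, a, X) - y              # residuals
    mse_hist.append(float(np.mean(r**2)))
    H = np.tanh(X @ W.T)                  # hidden activations
    a -= lr * H.T @ r / n                 # gradient step on output weights
    W -= lr * ((r[:, None] * (1 - H**2)) * a).T @ X / n  # move the particles
```

As the width grows, the empirical measure over the `(a_i, w_i)` pairs is what the mean-field analysis tracks.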
no code implementations • 31 Mar 2020 • Kenneth F. Caluya, Abhishek Halder
How can one steer a given joint state probability density function to another over a finite horizon, subject to controlled stochastic dynamics with hard state (sample path) constraints?
no code implementations • 4 Aug 2019 • Abhishek Halder, Kenneth F. Caluya, Bertrand Travacca, Scott J. Moura
We provide gradient flow interpretations for the continuous-time continuous-state Hopfield neural network (HNN).
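As a reminder of the setting (a generic continuous-time HNN with symmetric weights; the weights and sizes below are illustrative, not the paper's construction), the classical Hopfield energy is nonincreasing along the network's trajectory, which is what makes a gradient-flow reading natural. A forward-Euler simulation makes the descent visible:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
A = rng.normal(size=(n, n))
W = (A + A.T) / 2                 # symmetric weights ensure energy descent
b = rng.normal(size=n)

def energy(v):
    # Classical Hopfield energy with g = tanh; the integral of g^{-1} has
    # the closed form: int_0^v arctanh(s) ds = v*arctanh(v) + 0.5*log(1-v^2)
    integ = v * np.arctanh(v) + 0.5 * np.log(1 - v**2)
    return -0.5 * v @ W @ v - b @ v + integ.sum()

u = rng.normal(size=n) * 0.1      # internal states; outputs are v = tanh(u)
dt, energies = 0.01, []
for _ in range(2000):
    v = np.tanh(u)
    energies.append(float(energy(v)))
    u += dt * (-u + W @ v + b)    # continuous-time HNN dynamics, Euler step
```

Along exact trajectories dE/dt = -sum_i g'(u_i)(du_i/dt)^2 <= 0, so the recorded energies decrease (up to discretization error).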
no code implementations • 1 Aug 2019 • Kenneth F. Caluya, Abhishek Halder
The need to compute transient joint PDFs subject to prior dynamics arises in uncertainty propagation, nonlinear filtering, and stochastic control.
1 code implementation • 28 Sep 2018 • Kenneth F. Caluya, Abhishek Halder
We develop a new method to solve the Fokker-Planck (Kolmogorov forward) equation that governs the time evolution of the joint probability density function of a continuous-time stochastic nonlinear system.
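For orientation, here is a minimal sketch of the equation being solved, for a 1D Ornstein-Uhlenbeck process; this baseline explicit finite-difference scheme is illustrative only and is not the method developed in the paper.

```python
import numpy as np

# 1D Ornstein-Uhlenbeck SDE: dx = -x dt + sqrt(2D) dW
# Fokker-Planck equation:    d(rho)/dt = d(x rho)/dx + D d^2(rho)/dx^2
D = 0.5
x = np.linspace(-5, 5, 401)
dx = x[1] - x[0]
rho = np.exp(-(x - 2)**2 / 0.5)           # off-center initial density
rho /= rho.sum() * dx                      # normalize to unit mass

dt = 0.2 * dx**2 / D                       # explicit-scheme stability bound
for _ in range(40000):
    f = x * rho                            # drift flux x*rho
    drift = (np.roll(f, -1) - np.roll(f, 1)) / (2 * dx)       # d(x rho)/dx
    lap = (np.roll(rho, -1) - 2 * rho + np.roll(rho, 1)) / dx**2
    rho = rho + dt * (drift + D * lap)     # the wrap-around at the edges is
    rho = np.clip(rho, 0, None)            # harmless: rho vanishes there
    rho /= rho.sum() * dx                  # renormalize mass

stationary = np.exp(-x**2 / (2 * D))       # exact stationary density
stationary /= stationary.sum() * dx
```

The computed density relaxes to the Gaussian stationary solution exp(-x^2/(2D)), which satisfies the zero-flux condition x*rho + D*rho' = 0.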
Optimization and Control