First, we provide necessary and sufficient conditions for their structural state and input observability that can be verified efficiently in $O((m(n+p))^2)$ time, where $n$ is the number of state variables, $p$ is the number of unknown inputs, and $m$ is the number of modes.
Among the recent advances in systems and control (S\&C)-based analysis of optimization algorithms, not enough work has been specifically dedicated to machine learning (ML) algorithms and their applications.
Artificial neural networks, one of the most successful approaches to supervised learning, were originally inspired by their biological counterparts.
The Expectation-Maximization (EM) algorithm is one of the most popular methods used to solve the problem of parametric distribution-based clustering in unsupervised learning.
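As a concrete illustration of distribution-based clustering, the following is a minimal sketch of EM for a two-component one-dimensional Gaussian mixture; the function name, initialization, and synthetic data are illustrative assumptions, not taken from the source.

```python
# Hypothetical EM sketch for a two-component 1-D Gaussian mixture model.
# All names and the synthetic data below are illustrative assumptions.
import numpy as np

def em_gmm_1d(x, n_iter=50):
    # Initialize component means from data quantiles; equal weights, unit variances.
    mu = np.percentile(x, [25, 75]).astype(float)
    var = np.ones(2)
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(3.0, 1.0, 500)])
pi, mu, var = em_gmm_1d(x)
```

Each iteration alternates a soft assignment of points to components (E-step) with closed-form maximum-likelihood updates of the mixture parameters (M-step), which is the structure any S\&C-based analysis of EM would treat as the iterated map.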
In this setting, we first consider a feasibility problem consisting of tuning the edge weights such that certain controllability properties are satisfied.
Furthermore, we propose to assess its convergence by establishing asymptotic stability in the sense of Lyapunov.
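To make the stability viewpoint concrete, here is an illustrative sketch (not the paper's construction): gradient descent $x_{k+1} = x_k - \alpha \nabla f(x_k)$ viewed as a discrete-time dynamical system, with $V(x) = f(x) - f(x^*)$ checked numerically as a Lyapunov function for a strongly convex quadratic. The quadratic, step size, and initial point are assumptions chosen for the example.

```python
# Illustrative assumption: a strongly convex quadratic f(x) = 0.5 x^T A x,
# whose minimizer is x* = 0 with f(x*) = 0. Gradient descent is then a
# discrete-time system, and V(x) = f(x) should decrease along trajectories.
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # positive definite Hessian
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

alpha = 0.2  # step size below 2 / lambda_max(A), so the iteration is stable
x = np.array([4.0, -3.0])
V = [f(x)]
for _ in range(100):
    x = x - alpha * grad(x)   # one step of the discrete-time dynamics
    V.append(f(x))            # Lyapunov candidate evaluated along the trajectory
```

Monotone decrease of `V` along the iterates is exactly the certificate that asymptotic stability of the equilibrium $x^*$ translates into convergence of the algorithm.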