Formal Synthesis of Lyapunov Neural Networks

We propose an automatic and formally sound method for synthesising Lyapunov functions for the asymptotic stability of autonomous non-linear systems. Traditional methods are either analytical, requiring manual effort, or numerical, lacking formal soundness. Symbolic computational methods for Lyapunov functions, which sit in between, give formal guarantees but are typically semi-automatic because they rely on the user to provide appropriate function templates. We propose a method that finds Lyapunov functions fully automatically, using machine learning, while also providing formal guarantees, using satisfiability modulo theories (SMT). We employ a counterexample-guided approach in which a numerical learner and a symbolic verifier interact to construct provably correct Lyapunov neural networks (LNNs). The learner trains a neural network that satisfies the Lyapunov criteria for asymptotic stability over a set of samples; the verifier either proves via SMT solving that the criteria are satisfied over the whole domain or augments the sample set with counterexamples. Our method supports neural networks with polynomial activation functions and of varying depth and width, which gives them broad learning capability. We demonstrate our method on several non-trivial benchmarks and compare it favourably against a numerical optimisation-based approach, a symbolic template-based approach, and a cognate LNN-based approach. Our method synthesises Lyapunov functions faster and over wider spatial domains than the alternatives, while providing guarantees that are as strong or stronger.
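For an autonomous system dx/dt = f(x), the Lyapunov criteria referenced above require V(0) = 0, V(x) > 0 for all x ≠ 0 in the domain, and dV/dt = ∇V(x) · f(x) < 0 along trajectories. The sketch below illustrates the counterexample-guided loop described in the abstract; it is a minimal illustration under stated assumptions, not the paper's implementation. The system f, the PolyLNN class, the random-search learner, and the sampling-based verifier are all hypothetical stand-ins: the paper trains the network numerically and, crucially, replaces the sampling check with an SMT solver to obtain a formal proof over the whole domain.

```python
import numpy as np

# Hypothetical sketch of the counterexample-guided loop from the abstract.
# Lyapunov criteria for asymptotic stability of dx/dt = f(x) at the origin:
#   V(0) = 0,  V(x) > 0 for x != 0,  dV/dt = grad V(x) . f(x) < 0.

def f(x):
    """An example autonomous non-linear system (assumed for illustration)."""
    return np.array([-x[0] + x[0] * x[1], -x[1]])

class PolyLNN:
    """One-hidden-layer network with square (polynomial) activations,
    a stand-in for the paper's Lyapunov neural networks (LNNs)."""
    def __init__(self, dim, width, rng):
        self.W = rng.standard_normal((width, dim))

    def V(self, x):
        # V(x) = ||W x||^2, so V(0) = 0 and V >= 0 by construction;
        # strict positivity away from 0 would also need verifying.
        return float(np.sum((self.W @ x) ** 2))

    def Vdot(self, x):
        grad = 2.0 * self.W.T @ (self.W @ x)   # gradient of V at x
        return float(grad @ f(x))              # Lie derivative along f

def learner(samples, dim, rng, attempts=500):
    """Numerical learner: find weights making Vdot < 0 on the sample set.
    (Random search here; the paper trains the network instead.)"""
    for _ in range(attempts):
        net = PolyLNN(dim, width=4, rng=rng)
        if all(net.Vdot(x) < 0.0 for x in samples):
            return net
    return None

def verifier(net, rng, n_checks=10_000, radius=1.0):
    """Sampling-based falsifier standing in for the paper's SMT verifier,
    which proves the criteria over the *whole* domain, not just samples."""
    for _ in range(n_checks):
        x = rng.uniform(-radius, radius, size=2)
        if np.linalg.norm(x) > 1e-3 and net.Vdot(x) >= 0.0:
            return x                            # counterexample found
    return None

rng = np.random.default_rng(0)
samples = [rng.uniform(-1.0, 1.0, size=2) for _ in range(20)]
for _ in range(10):                             # counterexample-guided loop
    net = learner(samples, dim=2, rng=rng)
    if net is None:
        print("learner failed on the current sample set")
        break
    cex = verifier(net, rng)
    if cex is None:
        print("no counterexample found; candidate passed all checks")
        break
    samples.append(cex)                         # augment the sample set
```

In the paper's setting, a verifier answer of "no counterexample" is a formal SMT proof that the Lyapunov criteria hold over the entire domain, which is what makes the synthesised network provably correct rather than merely empirically validated.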
