1 code implementation • 22 Jun 2023 • Nicholas H. Barbara, Max Revay, Ruigang Wang, Jing Cheng, Ian R. Manchester
Neural networks are typically sensitive to small input perturbations, leading to unexpected or brittle behaviour.
no code implementations • 8 Dec 2021 • Ruigang Wang, Nicholas H. Barbara, Max Revay, Ian R. Manchester
This paper proposes a nonlinear policy architecture for control of partially-observed linear dynamical systems, providing built-in closed-loop stability guarantees.
no code implementations • 1 Oct 2021 • Ian R. Manchester, Max Revay, Ruigang Wang
This tutorial paper provides an introduction to recently developed tools for machine learning, especially learning dynamical systems (system identification), with stability and robustness constraints.
no code implementations • 29 Jul 2021 • Max Revay, Jack Umenberger, Ian R. Manchester
This paper proposes methods for identification of large-scale networked systems with guarantees that the resulting model will be contracting -- a strong form of nonlinear stability -- and/or monotone, i.e. order relations between states are preserved.
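As a minimal illustration of these two properties (a toy system, not the paper's identification method), consider the discrete-time dynamics x_{t+1} = tanh(A x_t): if A has nonnegative entries the map preserves elementwise order (monotone), and if the spectral norm of A is below 1 the map is contracting in the 2-norm, since tanh is 1-Lipschitz. All matrix values below are hypothetical.

```python
import numpy as np

# Toy system x_{t+1} = tanh(A x_t), chosen so that:
#  - A has nonnegative entries -> elementwise order is preserved (monotone)
#  - ||A||_2 < 1 and tanh is 1-Lipschitz -> the map is contracting
A = np.array([[0.3, 0.2],
              [0.1, 0.4]])

def step(x):
    """One step of the toy dynamics."""
    return np.tanh(A @ x)

# Monotonicity: x >= y elementwise implies step(x) >= step(y).
x = np.array([1.0, 0.5])
y = np.array([0.2, 0.1])
assert np.all(step(x) >= step(y))

# Contraction: the distance between trajectories shrinks at every step.
d0 = np.linalg.norm(x - y)
d1 = np.linalg.norm(step(x) - step(y))
assert d1 < d0
```

Contraction implies that all trajectories converge to each other regardless of initial condition, which is the "strong form of nonlinear stability" referred to in the abstract.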
1 code implementation • 13 Apr 2021 • Max Revay, Ruigang Wang, Ian R. Manchester
RENs are otherwise very flexible: they can represent all stable linear systems, all previously known sets of contracting recurrent neural networks and echo state networks, all deep feedforward neural networks, and all stable Wiener/Hammerstein models, and they can approximate all fading-memory and contracting nonlinear systems.
no code implementations • 1 Jan 2021 • Max Revay, Ruigang Wang, Ian Manchester
In image classification experiments we show that the Lipschitz bounds are very accurate and improve robustness to adversarial attacks.
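For context, a standard but loose Lipschitz upper bound for a feedforward network with 1-Lipschitz activations (e.g. ReLU) is the product of the layers' spectral norms; the bounds in this work are claimed to be much tighter than such generic baselines. The sketch below shows only the generic baseline, with hypothetical weights, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical two-layer network weights (4 -> 8 -> 4).
weights = [rng.standard_normal((8, 4)), rng.standard_normal((4, 8))]

def naive_lipschitz_bound(weights):
    """Product of spectral norms: a valid (but typically loose) upper
    bound on the 2-norm Lipschitz constant of the network."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # largest singular value
    return bound

def net(x):
    """Feedforward net with ReLU (1-Lipschitz) hidden activations."""
    for W in weights[:-1]:
        x = np.maximum(W @ x, 0.0)
    return weights[-1] @ x

L = naive_lipschitz_bound(weights)

# Empirical sanity check: finite differences never exceed the bound.
for _ in range(100):
    a, b = rng.standard_normal(4), rng.standard_normal(4)
    assert np.linalg.norm(net(a) - net(b)) <= L * np.linalg.norm(a - b) + 1e-9
```

A small certified Lipschitz constant limits how much any bounded input perturbation (such as an adversarial attack) can change the output, which is why tighter bounds translate into stronger robustness certificates.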
no code implementations • 5 Oct 2020 • Max Revay, Ruigang Wang, Ian R. Manchester
In image classification experiments we show that the Lipschitz bounds are very accurate and improve robustness to adversarial attacks.
no code implementations • 11 Apr 2020 • Max Revay, Ruigang Wang, Ian R. Manchester
Recurrent neural networks (RNNs) are a class of nonlinear dynamical systems often used to model sequence-to-sequence maps.
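The dynamical-systems view of an RNN can be sketched as a hidden state h_t evolving under h_{t+1} = tanh(W h_t + U u_t + b) with output y_t = C h_t; the sequence-to-sequence map sends the input sequence (u_t) to the output sequence (y_t). The shapes and weight values below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_h, n_u, n_y = 4, 2, 1                     # hypothetical dimensions
W = 0.4 * rng.standard_normal((n_h, n_h))   # state-transition weights
U = rng.standard_normal((n_h, n_u))         # input weights
b = np.zeros(n_h)                           # bias
C = rng.standard_normal((n_y, n_h))         # output weights

def rnn(inputs, h0=None):
    """Run the RNN as a dynamical system: map an input sequence to an
    output sequence of the same length."""
    h = np.zeros(n_h) if h0 is None else h0
    outputs = []
    for u in inputs:
        h = np.tanh(W @ h + U @ u + b)      # state update
        outputs.append(C @ h)               # output equation
    return np.array(outputs)

seq = rng.standard_normal((10, n_u))
y = rnn(seq)
assert y.shape == (10, n_y)
```

Because the state update is iterated, stability properties of this map (e.g. contraction of the hidden state) govern how sensitive the output sequence is to the initial state and to input perturbations.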
1 code implementation • L4DC 2020 • Max Revay, Ian R. Manchester
Stability of recurrent models is closely linked with trainability, generalizability, and, in some applications, safety.