Search Results for author: Max Revay

Found 9 papers, 3 papers with code

Learning over All Stabilizing Nonlinear Controllers for a Partially-Observed Linear System

no code implementations • 8 Dec 2021 • Ruigang Wang, Nicholas H. Barbara, Max Revay, Ian R. Manchester

This paper proposes a nonlinear policy architecture with built-in closed-loop stability guarantees for the control of partially-observed linear dynamical systems.

Reinforcement Learning (RL)
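The paper's specific policy architecture is not reproduced here. As a minimal, generic illustration of output-feedback control of a partially observed linear system with an explicit closed-loop stability check, the Python sketch below simulates an observer-based controller; all matrices and gains are illustrative placeholders, not taken from the paper.

# Minimal sketch (not the paper's architecture): observer-based output feedback
# for a partially observed discrete-time linear system x+ = A x + B u, y = C x.
# All matrices and gains are illustrative placeholders.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # double-integrator-like dynamics
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])               # only the first state is measured

K = np.array([[2.0, 3.0]])               # state-feedback gain, u = -K x_hat
L = np.array([[0.5], [1.0]])             # observer gain

# Closed-loop matrix in (state, estimation-error) coordinates; the loop is
# stable when its spectral radius is below 1.
Acl = np.block([
    [A - B @ K,        B @ K],
    [np.zeros_like(A), A - L @ C],
])
print("closed-loop spectral radius:", max(abs(np.linalg.eigvals(Acl))))

# Roll out the loop from a nonzero initial state using only the measured output.
x, xhat = np.array([[1.0], [0.0]]), np.zeros((2, 1))
for _ in range(50):
    u = -K @ xhat
    y = C @ x
    x = A @ x + B @ u
    xhat = A @ xhat + B @ u + L @ (y - C @ xhat)
print("final state norm:", float(np.linalg.norm(x)))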

Contraction-Based Methods for Stable Identification and Robust Machine Learning: a Tutorial

no code implementations • 1 Oct 2021 • Ian R. Manchester, Max Revay, Ruigang Wang

This tutorial paper provides an introduction to recently developed tools for machine learning, especially learning dynamical systems (system identification), with stability and robustness constraints.

BIG-bench Machine Learning

Distributed Identification of Contracting and/or Monotone Network Dynamics

no code implementations • 29 Jul 2021 • Max Revay, Jack Umenberger, Ian R. Manchester

This paper proposes methods for identification of large-scale networked systems with guarantees that the resulting model will be contracting -- a strong form of nonlinear stability -- and/or monotone, i.e. order relations between states are preserved.
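For a concrete reading of the two properties above, consider a linear discrete-time model x+ = A x: a spectral norm below one gives contraction in the Euclidean metric, and elementwise nonnegativity of A gives monotonicity (order preservation). The sketch below checks both for an arbitrary example matrix; the paper's distributed identification method itself is not reproduced.

# Illustration of the two model properties above for a linear model x+ = A x.
# A is an arbitrary example; the identification method itself is not shown.
import numpy as np

A = np.array([[0.5, 0.2, 0.0],
              [0.1, 0.6, 0.1],
              [0.0, 0.3, 0.4]])

# Contraction (Euclidean metric): |A x - A z| <= ||A|| |x - z|, so ||A|| < 1
# guarantees that any two trajectories converge toward each other.
spec_norm = np.linalg.norm(A, 2)
print(f"contracting: {spec_norm < 1} (||A|| = {spec_norm:.3f})")

# Monotonicity: x >= z elementwise implies A x >= A z, which for a linear map
# holds exactly when every entry of A is nonnegative.
print("monotone:", bool(np.all(A >= 0)))

# Numerical spot check of order preservation on a random ordered pair.
rng = np.random.default_rng(0)
z = rng.standard_normal(3)
x = z + rng.uniform(0.0, 1.0, size=3)   # x >= z elementwise
print("order preserved on sample:", bool(np.all(A @ x >= A @ z)))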

Recurrent Equilibrium Networks: Flexible Dynamic Models with Guaranteed Stability and Robustness

1 code implementation • 13 Apr 2021 • Max Revay, Ruigang Wang, Ian R. Manchester

RENs are very flexible: they can represent all stable linear systems, all previously known sets of contracting recurrent neural networks and echo state networks, all deep feedforward neural networks, and all stable Wiener/Hammerstein models, and they can approximate all fading-memory and contracting nonlinear systems.
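The REN parameterization and its built-in certificates are given in the paper and its code release; the sketch below only illustrates the general shape of such models, combining a linear state update with an equilibrium (implicit) layer solved by fixed-point iteration. Weights are random placeholders, rescaled only to keep the demo well-behaved, not to certify stability.

# Generic sketch of a recurrent model with an equilibrium ("implicit") layer.
# This is NOT the REN parameterization from the paper: weights are random
# placeholders and no stability or robustness certificate is enforced.
import numpy as np

rng = np.random.default_rng(1)
nx, nv, nu = 4, 8, 2                        # state, equilibrium, input sizes

W = {name: 0.3 * rng.standard_normal(shape) for name, shape in {
    "D11": (nv, nv), "C1": (nv, nx), "D12": (nv, nu),   # equilibrium layer
    "A": (nx, nx), "B1": (nx, nv), "B2": (nx, nu),      # state update
}.items()}
W["D11"] *= 0.9 / np.linalg.norm(W["D11"], 2)   # ||D11|| < 1: fixed point exists, iteration converges
W["A"] *= 0.8 / np.linalg.norm(W["A"], 2)       # keep the demo's linear part well-behaved

def equilibrium_layer(x, u, n_iter=50):
    """Solve v = tanh(D11 v + C1 x + D12 u) by fixed-point iteration."""
    v = np.zeros(nv)
    for _ in range(n_iter):
        v = np.tanh(W["D11"] @ v + W["C1"] @ x + W["D12"] @ u)
    return v

def step(x, u):
    """One time step: solve the equilibrium layer, then update the state."""
    v = equilibrium_layer(x, u)
    return W["A"] @ x + W["B1"] @ v + W["B2"] @ u

x = np.zeros(nx)
for t in range(10):
    u = np.array([np.sin(0.3 * t), np.cos(0.2 * t)])
    x = step(x, u)
print("state after 10 steps:", np.round(x, 3))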

Lipschitz-Bounded Equilibrium Networks

no code implementations • 1 Jan 2021 • Max Revay, Ruigang Wang, Ian Manchester

In image classification experiments we show that the Lipschitz bounds are very accurate and improve robustness to adversarial attacks.

Generalization Bounds, Image Classification
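The certified Lipschitz bounds studied in this paper are not reproduced here. As a generic illustration of what such a bound means, the sketch below empirically lower-bounds the Lipschitz constant of a small placeholder network from random input pairs and compares it with the naive layer-norm-product upper bound; any valid certified upper bound must exceed the empirical lower bound, and a tight one should improve on the norm product.

# Generic illustration of Lipschitz bounds on a small placeholder network
# (not an equilibrium network and not the paper's certification method).
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((16, 8)), rng.standard_normal(16)
W2, b2 = rng.standard_normal((4, 16)), rng.standard_normal(4)

def f(x):
    """Two-layer network: relu layer followed by a linear readout."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Empirical lower bound on the Lipschitz constant from random nearby pairs:
# any certified upper bound must be at least this large.
lower = 0.0
for _ in range(10_000):
    x1 = rng.standard_normal(8)
    x2 = x1 + 1e-2 * rng.standard_normal(8)
    lower = max(lower, np.linalg.norm(f(x1) - f(x2)) / np.linalg.norm(x1 - x2))

# Naive certified upper bound: product of the layers' spectral norms
# (valid because relu is 1-Lipschitz), typically quite loose.
upper = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)
print(f"empirical lower bound: {lower:.2f}   norm-product upper bound: {upper:.2f}")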

Lipschitz Bounded Equilibrium Networks

no code implementations • 5 Oct 2020 • Max Revay, Ruigang Wang, Ian R. Manchester

In image classification experiments we show that the Lipschitz bounds are very accurate and improve robustness to adversarial attacks.

Generalization Bounds, Image Classification

A Convex Parameterization of Robust Recurrent Neural Networks

no code implementations • 11 Apr 2020 • Max Revay, Ruigang Wang, Ian R. Manchester

Recurrent neural networks (RNNs) are a class of nonlinear dynamical systems often used to model sequence-to-sequence maps.
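As a minimal illustration of an RNN viewed as a nonlinear dynamical system mapping input sequences to output sequences, the sketch below rolls out a small vanilla RNN whose recurrent weight is rescaled so that the hidden-state map is contracting; this spectral-norm rescaling is a simple surrogate, not the convex parameterization proposed in the paper.

# A vanilla RNN as a nonlinear dynamical system mapping an input sequence to
# an output sequence.  The recurrent weight is rescaled so that ||W_h|| < 1,
# which (since |tanh'| <= 1) makes the hidden-state map contracting; this is a
# simple surrogate, not the convex parameterization proposed in the paper.
import numpy as np

rng = np.random.default_rng(2)
nh, nu, ny = 16, 3, 2
W_h = rng.standard_normal((nh, nh))
W_h *= 0.9 / np.linalg.norm(W_h, 2)          # enforce ||W_h|| < 1
W_u = rng.standard_normal((nh, nu))
W_y = rng.standard_normal((ny, nh))

def rnn(inputs):
    """Map an input sequence of shape (T, nu) to an output sequence (T, ny)."""
    h = np.zeros(nh)
    outputs = []
    for u in inputs:
        h = np.tanh(W_h @ h + W_u @ u)        # nonlinear state update
        outputs.append(W_y @ h)               # linear readout
    return np.array(outputs)

T = 20
u_seq = rng.standard_normal((T, nu))
print("output sequence shape:", rnn(u_seq).shape)

# Contraction in action: the effect of perturbing the first input decays
# geometrically through the contracting state map.
u_pert = np.vstack([u_seq[:1] + 1.0, u_seq[1:]])
gap = np.linalg.norm(rnn(u_seq)[-1] - rnn(u_pert)[-1])
print("final-step output gap after perturbing the first input:", round(float(gap), 4))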

Contracting Implicit Recurrent Neural Networks: Stable Models with Improved Trainability

1 code implementation • L4DC 2020 • Max Revay, Ian R. Manchester

Stability of recurrent models is closely linked with trainability, generalizability and in some applications, safety.
