no code implementations • 16 Mar 2013 • Feng Tan, Jean-Jacques Slotine
The algorithm treats each data point as a single cell, and uses knowledge of local connectivity to cluster cells into multiple colonies simultaneously.
no code implementations • 25 Apr 2018 • Winfried Lohmiller, Philipp Gassert, Jean-Jacques Slotine
We discuss technical results on learning function approximations using piecewise-linear basis functions, and analyze their stability and convergence using nonlinear contraction theory.
no code implementations • ICLR Workshop DeepDiffEq 2019 • Jared Quincy Davis, Krzysztof Choromanski, Jake Varley, Honglak Lee, Jean-Jacques Slotine, Valerii Likhosherstov, Adrian Weller, Ameesh Makadia, Vikas Sindhwani
Neural Ordinary Differential Equations (ODEs) are elegant reinterpretations of deep networks where continuous time can replace the discrete notion of depth, ODE solvers perform forward propagation, and the adjoint method enables efficient, constant memory backpropagation.
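A minimal sketch of this reinterpretation, assuming a hypothetical `tanh` vector field with parameters `W`, `b` (not any particular paper's architecture): forward propagation is just numerical integration, and taking more steps refines the same continuous flow rather than changing the model.

```python
import numpy as np

def neural_ode_forward(x, W, b, t1=1.0, steps=100):
    """Integrate dx/dt = tanh(W @ x + b) from t=0 to t=t1 with forward Euler.

    Continuous time replaces discrete depth: more integration steps
    refine the same flow instead of adding new layers."""
    dt = t1 / steps
    for _ in range(steps):
        x = x + dt * np.tanh(W @ x + b)
    return x

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) / 4.0
b = np.zeros(4)
x0 = rng.normal(size=4)
x1 = neural_ode_forward(x0, W, b)
```

In practice an adaptive ODE solver replaces the fixed-step Euler loop, and the adjoint method backpropagates through the integration in constant memory.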
no code implementations • NeurIPS 2020 • Krzysztof Choromanski, Jared Quincy Davis, Valerii Likhosherstov, Xingyou Song, Jean-Jacques Slotine, Jacob Varley, Honglak Lee, Adrian Weller, Vikas Sindhwani
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the orthogonal group O(d).
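The core ingredient, a matrix flow evolving on O(d), can be sketched with a Cayley-transform integrator, which stays exactly on the orthogonal group at every step (a generic sketch, not the paper's specific parametrization):

```python
import numpy as np

def cayley_step(W, S, dt):
    """One step of the matrix flow dW/dt = W @ S for skew-symmetric S.

    The Cayley transform (I - dt/2 S)^(-1) (I + dt/2 S) is exactly
    orthogonal whenever S is skew, so W never leaves O(d)."""
    I = np.eye(W.shape[0])
    Q = np.linalg.solve(I - 0.5 * dt * S, I + 0.5 * dt * S)
    return W @ Q

rng = np.random.default_rng(1)
d = 5
A = rng.normal(size=(d, d))
S = A - A.T                      # skew-symmetric generator
W = np.eye(d)
for _ in range(50):
    W = cayley_step(W, S, dt=0.1)
```

Evolving weights on O(d) keeps their singular values equal to one, which controls how gradients grow or shrink through the flow.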
no code implementations • 4 Mar 2021 • Hiroyasu Tsukamoto, Soon-Jo Chung, Jean-Jacques Slotine
Adaptive control is subject to stability and performance issues when a learned model is introduced to improve the controller.
1 code implementation • 7 Mar 2021 • Spencer M. Richards, Navid Azizan, Jean-Jacques Slotine, Marco Pavone
Real-time adaptation is imperative to the control of robots operating in complex, dynamic environments.
1 code implementation • ICCV 2021 • Heng Yang, Chris Doran, Jean-Jacques Slotine
We study the problem of aligning two sets of 3D geometric primitives given known correspondences.
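In the special case where the primitives are points, alignment with known correspondences has the classical closed-form SVD (Kabsch/Horn) solution sketched below; the paper generalizes beyond points, but the point case fixes the core problem:

```python
import numpy as np

def align_points(P, Q):
    """Optimal rigid transform (R, t) minimizing sum_i ||R p_i + t - q_i||^2
    for corresponding 3xN point sets (Kabsch/Horn SVD method)."""
    p_bar = P.mean(axis=1, keepdims=True)
    q_bar = Q.mean(axis=1, keepdims=True)
    H = (P - p_bar) @ (Q - q_bar).T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T            # nearest proper rotation (det = +1)
    t = q_bar - R @ p_bar
    return R, t

rng = np.random.default_rng(2)
R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(R_true) < 0:
    R_true[:, 0] *= -1            # force det = +1
t_true = rng.normal(size=(3, 1))
P = rng.normal(size=(3, 12))
R, t = align_points(P, R_true @ P + t_true)
```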
1 code implementation • 16 Jun 2021 • Leo Kozachkov, Michaela Ennis, Jean-Jacques Slotine
Recurrent neural networks (RNNs) are widely used throughout neuroscience as models of local neural activity.
no code implementations • 2 Oct 2021 • Hiroyasu Tsukamoto, Soon-Jo Chung, Jean-Jacques Slotine, Chuchu Fan
This paper presents a theoretical overview of a Neural Contraction Metric (NCM): a neural network model of an optimal contraction metric and corresponding differential Lyapunov function, the existence of which is a necessary and sufficient condition for incremental exponential stability of non-autonomous nonlinear system trajectories.
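In standard contraction-theory notation (a sketch of the condition such a metric satisfies, not the paper's exact statement), a uniformly positive definite metric $M(x,t)$ certifies incremental exponential stability of $\dot{x} = f(x,t)$ when

```latex
\dot{M} + M\,\frac{\partial f}{\partial x}
      + \left(\frac{\partial f}{\partial x}\right)^{\!\top} M
  \preceq -2\alpha M,
\qquad M(x,t) \succeq \underline{m}\, I \succ 0,
```

in which case $V(\delta x) = \delta x^{\top} M\, \delta x$ is a differential Lyapunov function and any two trajectories converge toward each other at exponential rate $\alpha$.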
no code implementations • 17 Jan 2022 • Leo Kozachkov, Patrick M. Wensing, Jean-Jacques Slotine
We prove that Riemannian contraction in a supervised learning setting implies generalization.
1 code implementation • 14 Apr 2022 • Spencer M. Richards, Navid Azizan, Jean-Jacques Slotine, Marco Pavone
Real-time adaptation is imperative to the control of robots operating in complex, dynamic environments.
no code implementations • 28 Jul 2022 • Brett T. Lopez, Jean-Jacques Slotine
This work applies universal adaptive control to control barrier functions to achieve forward invariance of a safe set despite the presence of unmatched parametric uncertainties.
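In standard control barrier function notation (a generic sketch, not the paper's exact condition), forward invariance of a safe set $\mathcal{S} = \{x : h(x) \ge 0\}$ for $\dot{x} = f(x) + g(x)u$ is guaranteed when the control keeps

```latex
\frac{\partial h}{\partial x}\bigl(f(x) + g(x)\,u\bigr) \ge -\alpha\bigl(h(x)\bigr)
```

for an extended class-$\mathcal{K}$ function $\alpha$; the difficulty addressed here is enforcing this when the uncertainty does not enter through the same channel as $u$ (the unmatched case).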
1 code implementation • 6 Feb 2023 • Spencer M. Richards, Jean-Jacques Slotine, Navid Azizan, Marco Pavone
Even for known nonlinear dynamical systems, feedback controller synthesis is a difficult problem that often requires leveraging the particular structure of the dynamics to induce a stable closed-loop system.
1 code implementation • 8 Jun 2023 • Carlos Esteves, Jean-Jacques Slotine, Ameesh Makadia
Spherical CNNs generalize CNNs to functions on the sphere, by using spherical convolutions as the main linear operation.
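One common definition of spherical convolution (a sketch; conventions vary across the spherical CNN literature) correlates a signal $f$ on $S^2$ with a filter $k$ over all rotations:

```latex
(f \star k)(R) \;=\; \int_{S^2} f(x)\, k\!\bigl(R^{-1}x\bigr)\, \mathrm{d}x,
\qquad R \in SO(3),
```

and, like planar convolution, it diagonalizes in the spherical harmonic domain, which is what makes efficient implementations possible.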
no code implementations • 15 Jun 2023 • Winfried Lohmiller, Philipp Gassert, Jean-Jacques Slotine
Global exponential convergence of the algorithm is established using Contraction Theory with Inequality Constraints, which this paper extends from the continuous to the discrete case. In contrast to deep learning, the parametrization of each linear function piece in the proposed MinMax network is linear.
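A minimal sketch of a min-max representation of a piecewise-linear function (generic, with hypothetical parameter shapes, not the paper's exact architecture): each affine piece $w^\top x + b$ is linear in its parameters, and the min of maxes only selects which piece is active.

```python
import numpy as np

def minmax_net(x, W, b):
    """Piecewise-linear f(x) = min_i max_j (W[i, j] @ x + b[i, j]).

    Each affine piece is linear in its parameters (W, b); the min/max
    structure only selects which piece is active at a given x."""
    pieces = np.einsum('ijk,k->ij', W, x) + b   # shape (groups, pieces)
    return pieces.max(axis=1).min()

# Example: |x| = max(x, -x) as one group of two affine pieces.
W = np.array([[[1.0], [-1.0]]])
b = np.zeros((1, 2))
y = minmax_net(np.array([-3.0]), W, b)
```

Because the output is linear in (W, b) wherever the active piece is fixed, fitting reduces locally to a linear problem, unlike a generic deep network.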
no code implementations • 14 Sep 2023 • Ron Ofir, Jean-Jacques Slotine, Michael Margaliot
We derive a sufficient condition for $k$-contraction in a generalized Lurie system (GLS), that is, the feedback connection of a nonlinear dynamical system and a memoryless nonlinear function.
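A representative Lurie-type structure (a sketch of the generic form; the paper's generalized version may differ in details) is the feedback interconnection

```latex
\dot{x} = f(t, x) + B\,u, \qquad y = C x, \qquad u = -\varphi(t, y),
```

where $\varphi$ is a memoryless nonlinearity, typically assumed sector-bounded. Here $k$-contraction generalizes ordinary contraction ($k = 1$) by requiring that $k$-dimensional volumes, rather than distances, shrink along trajectories.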
no code implementations • 2 Oct 2023 • Michaela Ennis, Leo Kozachkov, Jean-Jacques Slotine
To push forward the important emerging research field surrounding multi-area recurrent neural networks (RNNs), we expand theoretically and empirically on the provably stable RNNs of RNNs introduced by Kozachkov et al. in "RNNs of RNNs: Recursive Construction of Stable Assemblies of Recurrent Neural Networks".
no code implementations • 7 Nov 2023 • Bing Song, Jean-Jacques Slotine, Quang-Cuong Pham
We propose a novel way to integrate control techniques with reinforcement learning (RL) for stability, robustness, and generalization: leveraging contraction theory to realize modularity in neural control, which ensures that combinations of stable subsystems automatically preserve stability.
no code implementations • 9 Nov 2023 • Brett T. Lopez, Jean-Jacques Slotine
We present a new direct adaptive control approach for nonlinear systems with unmatched and matched uncertainties.
no code implementations • 14 Nov 2023 • Leo Kozachkov, Jean-Jacques Slotine, Dmitry Krotov
Such multi-neuron synapses are ubiquitous in models of Dense Associative Memory (also known as Modern Hopfield Networks) and are known to lead to superlinear memory storage capacity, which is a desirable computational feature.
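A minimal Dense Associative Memory sketch (with the hypothetical interaction function F(z) = max(z, 0)**n; the literature uses several choices): the energy couples each stored memory to n-tuples of neurons at once, which is where the superlinear capacity comes from.

```python
import numpy as np

def dam_energy(x, Xi, n=3):
    """Dense Associative Memory energy E(x) = -sum_mu F(xi_mu . x)
    with F(z) = max(z, 0)**n; n > 2 gives the higher-order
    (multi-neuron) synapses discussed above."""
    return -np.sum(np.maximum(Xi @ x, 0.0) ** n)

def dam_sweep(x, Xi, n=3):
    """One asynchronous sweep: each bit is set to whichever sign gives
    the lower energy, so the energy never increases."""
    x = x.copy()
    for i in range(len(x)):
        plus, minus = x.copy(), x.copy()
        plus[i], minus[i] = 1.0, -1.0
        x[i] = 1.0 if dam_energy(plus, Xi, n) <= dam_energy(minus, Xi, n) else -1.0
    return x

rng = np.random.default_rng(3)
Xi = rng.choice([-1.0, 1.0], size=(3, 16))   # three stored +/-1 patterns
probe = Xi[0].copy()
probe[0] *= -1                               # corrupt one bit
before = dam_energy(probe, Xi)
recalled = dam_sweep(probe, Xi)
after = dam_energy(recalled, Xi)
```

Running the sweep on a lightly corrupted pattern typically descends the energy back to the stored memory; sharper F (larger n) separates more memories before their basins interfere.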