Search Results for author: Lyudmila Grigoryeva

Found 12 papers, 1 paper with code

Learning strange attractors with reservoir systems

no code implementations · 11 Aug 2021 · Lyudmila Grigoryeva, Allen Hart, Juan-Pablo Ortega

This paper shows that the celebrated Takens Embedding Theorem is a particular case of a much more general statement: randomly generated linear state-space representations of generic observations of an invertible dynamical system carry in their wake an embedding of the phase-space dynamics into the chosen Euclidean state space.

Representation Learning
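
As a minimal numerical illustration of the statement (a sketch, not the paper's construction; the Lorenz system, the Euler integrator, the state dimension, and all parameter values below are arbitrary choices): drive a randomly generated linear state-space system x_{t+1} = A x_t + C y_t with scalar observations y_t of a chaotic system, and the state trajectory typically traces out an embedded copy of the attractor.

```python
import numpy as np

# Simulate the Lorenz system with a simple Euler scheme (illustrative choice).
def lorenz_observations(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    z = np.array([1.0, 1.0, 1.0])
    obs = np.empty(n_steps)
    for t in range(n_steps):
        dz = np.array([sigma * (z[1] - z[0]),
                       z[0] * (rho - z[2]) - z[1],
                       z[0] * z[1] - beta * z[2]])
        z = z + dt * dz
        obs[t] = z[0]          # generic scalar observation: the x-coordinate
    return obs

rng = np.random.default_rng(0)
n, dim = 20_000, 3             # state-space dimension chosen arbitrarily

# Randomly generated linear state-space system: x_{t+1} = A x_t + C y_t.
A = rng.normal(size=(dim, dim))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))   # rescale for stability
C = rng.normal(size=dim)

y = lorenz_observations(n)
x = np.zeros(dim)
states = np.empty((n, dim))
for t in range(n):
    x = A @ x + C * y[t]
    states[t] = x

# After a washout, the state trajectory typically traces out an embedded copy
# of the attractor; e.g. scatter-plot states[5000:, 0] vs states[5000:, 1].
print(states[-5:])
```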

Discrete-time signatures and randomness in reservoir computing

no code implementations · 17 Sep 2020 · Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann

A new explanation of the geometric nature of the reservoir computing phenomenon is presented.
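
A minimal sketch in the spirit of the randomized discrete-time signatures studied here, assuming only the general construction (a reservoir state driven by path increments through randomly sampled matrices); the dimensions, scaling, and tanh nonlinearity are illustrative choices, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)

def randomized_signature(path, dim=50, scale=0.1):
    """Randomized discrete-time signature of a d-dimensional path (sketch).

    The reservoir state is driven by the path increments through randomly
    sampled matrices; all sizes and the activation are illustrative.
    """
    n_steps, d = path.shape
    A = scale * rng.normal(size=(d, dim, dim))   # one random field per channel
    b = scale * rng.normal(size=(d, dim))
    X = np.zeros(dim)
    X[0] = 1.0                                   # conventional initial value
    for du in np.diff(path, axis=0):
        X = X + sum(np.tanh(A[i] @ X + b[i]) * du[i] for i in range(d))
    return X

# Two-dimensional toy path: time plus a noisy signal.
t = np.linspace(0.0, 1.0, 200)
path = np.stack([t, np.sin(6 * t) + 0.1 * rng.normal(size=t.size)], axis=1)
features = randomized_signature(path)   # fixed random features of the path
print(features[:5])
```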

Dimension reduction in recurrent networks by canonicalization

no code implementations · 23 Jul 2020 · Lyudmila Grigoryeva, Juan-Pablo Ortega

Many recurrent neural network machine learning paradigms can be formulated using state-space representations.

Dimensionality Reduction
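
For linear state-space systems, one classical way to remove redundant state dimensions is to project onto the reachable subspace; the sketch below does exactly that, purely to illustrate the idea of passing to a smaller equivalent realization. It is not the canonicalization procedure developed in the paper, and the example system is arbitrary.

```python
import numpy as np

def reduce_to_reachable(A, B, C, tol=1e-10):
    """Project a linear state-space system (A, B, C) onto its reachable
    subspace, giving a smaller realization with the same input-output
    behaviour from the zero initial state.  Illustrative only; not the
    paper's canonicalization construction."""
    n = A.shape[0]
    # Controllability matrix [B, AB, ..., A^{n-1} B].
    blocks, M = [B], B
    for _ in range(n - 1):
        M = A @ M
        blocks.append(M)
    K = np.hstack(blocks)
    # Orthonormal basis V of the reachable subspace via SVD.
    U, s, _ = np.linalg.svd(K, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))
    V = U[:, :r]
    return V.T @ A @ V, V.T @ B, C @ V

# Example: a 4-dimensional system whose input only ever excites 2 directions.
A = np.diag([0.9, 0.5, 0.3, 0.1])
B = np.array([[1.0], [1.0], [0.0], [0.0]])   # input never reaches states 3, 4
C = np.ones((1, 4))
Ar, Br, Cr = reduce_to_reachable(A, B, C)
print(Ar.shape)   # (2, 2): the unreachable directions have been removed
```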

Memory and forecasting capacities of nonlinear recurrent networks

no code implementations · 22 Apr 2020 · Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

The notion of memory capacity, originally introduced for echo state and linear networks with independent inputs, is generalized to nonlinear recurrent networks with stationary but dependent inputs.

Time Series
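
The classical memory capacity for echo state networks with independent inputs can be estimated numerically as below; the paper's generalization to dependent, stationary inputs is not attempted here, and the network size, lag range, and all parameter values are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n, dim, washout, max_lag = 10_000, 50, 200, 60

# Echo state network with i.i.d. uniform inputs (the classical setting).
W = rng.normal(size=(dim, dim))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9
w_in = rng.normal(size=dim)
u = rng.uniform(-1.0, 1.0, size=n)

x = np.zeros(dim)
states = np.empty((n, dim))
for t in range(n):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# k-th memory capacity: squared correlation between the optimal linear
# readout of the state and the input k steps in the past.
def capacity(k):
    X, y = states[washout:], u[washout - k:n - k]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.corrcoef(X @ coef, y)[0, 1] ** 2

MC = sum(capacity(k) for k in range(1, max_lag + 1))
print(f"estimated memory capacity: {MC:.2f} (at most dim = {dim})")
```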

Approximation Bounds for Random Neural Networks and Reservoir Systems

no code implementations · 14 Feb 2020 · Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

This work studies approximation based on single-hidden-layer feedforward and recurrent neural networks with randomly generated internal weights.
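
The feedforward half of this setting is the familiar random-features regression: sample the hidden weights once, never train them, and fit only the linear readout by least squares. A minimal sketch, with the width, weight scaling, activation, and target function all being arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_random_network(X, y, width=500, scale=2.0):
    """Single-hidden-layer network with randomly generated internal weights;
    only the linear readout is trained (by ordinary least squares)."""
    d = X.shape[1]
    A = scale * rng.normal(size=(d, width))     # frozen random hidden weights
    b = scale * rng.normal(size=width)
    H = np.tanh(X @ A + b)                      # random features
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return lambda X_new: np.tanh(X_new @ A + b) @ w

# Toy regression problem.
X = rng.uniform(-1.0, 1.0, size=(2_000, 2))
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])
model = fit_random_network(X[:1500], y[:1500])
test_err = np.mean((model(X[1500:]) - y[1500:]) ** 2)
print(f"test MSE: {test_err:.4f}")
```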

Risk bounds for reservoir computing

no code implementations · 30 Oct 2019 · Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

We analyze the practices of reservoir computing in the framework of statistical learning theory.

Learning Theory

Differentiable reservoir computing

no code implementations · 16 Feb 2019 · Lyudmila Grigoryeva, Juan-Pablo Ortega

That research is complemented in this paper with the characterization of the differentiability of reservoir filters for very general classes of discrete-time deterministic inputs.
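
Differentiability here means that a reservoir filter's output responds smoothly to perturbations of the input sequence, so its partial derivatives with respect to past inputs can be probed numerically. A sketch under arbitrary assumptions (the echo state network, its parameters, and the finite-difference step are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
dim, T = 30, 300

W = rng.normal(size=(dim, dim))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))   # contractive regime
w_in = rng.normal(size=dim)
w_out = rng.normal(size=dim)

def reservoir_filter(u):
    """Scalar output of a fixed echo state network at the final time."""
    x = np.zeros(dim)
    for ut in u:
        x = np.tanh(W @ x + w_in * ut)
    return w_out @ x

# Finite-difference estimate of d(output)/d(u_{T-k}) for several lags k.
u = rng.uniform(-1.0, 1.0, size=T)
eps = 1e-6
for k in (1, 5, 20, 50):
    up = u.copy()
    up[T - k] += eps
    deriv = (reservoir_filter(up) - reservoir_filter(u)) / eps
    print(f"lag {k:2d}: sensitivity {deriv:+.3e}")   # typically decays with lag
```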

Echo state networks are universal

no code implementations · 3 Jun 2018 · Lyudmila Grigoryeva, Juan-Pablo Ortega

This paper shows that echo state networks are universal uniform approximants in the context of discrete-time fading memory filters with uniformly bounded inputs defined on negative infinite times.
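
In practice the universality statement is exercised by training an echo state network's linear readout against a target fading-memory filter. The sketch below fits a small ESN to the NARMA-10 filter, an arbitrary illustrative target; the network size, spectral radius, ridge regularization, and data split are likewise arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)
n, dim, washout = 6_000, 200, 200

# Target: the NARMA-10 fading-memory filter (standard benchmark choice).
u = rng.uniform(0.0, 0.5, size=n)
y = np.zeros(n)
for t in range(9, n - 1):
    y[t + 1] = (0.3 * y[t] + 0.05 * y[t] * y[t - 9:t + 1].sum()
                + 1.5 * u[t - 9] * u[t] + 0.1)

# Echo state network: fixed random recurrent weights, trained linear readout.
W = rng.normal(size=(dim, dim))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.normal(size=dim)
x = np.zeros(dim)
states = np.empty((n, dim))
for t in range(n):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Ridge readout on a training segment, evaluation on the rest.
X_tr, y_tr = states[washout:4000], y[washout:4000]
coef = np.linalg.solve(X_tr.T @ X_tr + 1e-6 * np.eye(dim), X_tr.T @ y_tr)
y_hat = states[4000:] @ coef
nrmse = np.sqrt(np.mean((y_hat - y[4000:]) ** 2) / np.var(y[4000:]))
print(f"test NRMSE: {nrmse:.3f}")   # small values indicate a good fit
```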

Singular ridge regression with homoscedastic residuals: generalization error with estimated parameters

no code implementations · 29 May 2016 · Lyudmila Grigoryeva, Juan-Pablo Ortega

This paper characterizes the conditional distribution properties of the finite sample ridge regression estimator and uses that result to evaluate total regression and generalization errors that incorporate the inaccuracies committed at the time of parameter estimation.
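
The finite-sample ridge estimator in question is (X'X + lambda I)^{-1} X'y, which stays well defined even when X'X is singular. A quick numerical sketch of the estimator and the resulting out-of-sample error with estimated parameters; all dimensions, the regularization strength, and the noise level are chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(6)
n, p, lam = 100, 150, 1.0        # p > n makes X'X singular on purpose

beta = rng.normal(size=p) / np.sqrt(p)
X = rng.normal(size=(n, p))
y = X @ beta + 0.1 * rng.normal(size=n)   # homoscedastic residuals

# Ridge estimator: well defined even though X'X has rank at most n < p.
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Generalization error with the *estimated* parameters, on fresh data.
X_new = rng.normal(size=(10_000, p))
y_new = X_new @ beta + 0.1 * rng.normal(size=10_000)
gen_err = np.mean((X_new @ beta_hat - y_new) ** 2)
print(f"out-of-sample MSE: {gen_err:.4f} (noise floor: {0.1**2:.4f})")
```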

Nonlinear memory capacity of parallel time-delay reservoir computers in the processing of multidimensional signals

no code implementations · 13 Oct 2015 · Lyudmila Grigoryeva, Julie Henriques, Laurent Larger, Juan-Pablo Ortega

This paper addresses the reservoir design problem in the context of delay-based reservoir computers for multidimensional input signals, parallel architectures, and real-time multitasking.
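
A heavily simplified discrete-time sketch of the setting: a time-delay reservoir is a single nonlinear node whose delay loop is divided into virtual nodes addressed through a random input mask, and a parallel architecture runs one such reservoir per input channel with a joint readout. All constants, the tanh node, and the toy target below are illustrative assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(7)

def delay_reservoir(u, n_virtual=40, eta=0.5, gamma=0.8):
    """Simplified discrete-time model of a time-delay reservoir: one
    nonlinear node, a delay loop of n_virtual virtual nodes, and a random
    input mask (all constants here are illustrative)."""
    mask = rng.uniform(-1.0, 1.0, size=n_virtual)
    x = np.zeros(n_virtual)
    states = np.empty((u.size, n_virtual))
    for t, ut in enumerate(u):
        # Each virtual node sees the masked input plus its own delayed value.
        x = np.tanh(gamma * mask * ut + eta * x)
        states[t] = x
    return states

# Parallel architecture for a 3-dimensional input signal: one delay
# reservoir per channel, with the states concatenated for a joint readout.
n = 5_000
U = rng.uniform(-1.0, 1.0, size=(n, 3))
states = np.hstack([delay_reservoir(U[:, c]) for c in range(3)])

# Joint linear readout for a toy nonlinear memory task: the product of one
# channel with another channel lagged by two steps.
y = U[:, 0] * np.roll(U[:, 1], 2)
coef, *_ = np.linalg.lstsq(states[100:], y[100:], rcond=None)
print("readout fitted; feature dimension:", states.shape[1])
```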
