Search Results for author: Lyudmila Grigoryeva

Found 17 papers, 4 papers with code

Memory of recurrent networks: Do we compute it right?

1 code implementation • 2 May 2023 • Giovanni Ballarin, Lyudmila Grigoryeva, Juan-Pablo Ortega

Numerical evaluations of the memory capacity (MC) of recurrent neural networks reported in the literature often contradict well-established theoretical bounds.
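For orientation, the memory capacity of a state-space system with scalar input is the sum over delays k of the squared correlation between the delayed input z_{t-k} and its best linear reconstruction from the reservoir state; for linear reservoirs with i.i.d. input, theory caps it at the state dimension N. A minimal numerical sketch, with all constants illustrative and no claim to reproduce the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, K = 50, 5000, 100          # state dimension, sample length, max delay

# Random linear reservoir x_t = A x_{t-1} + C z_t, rescaled to spectral radius 0.9
A = rng.normal(size=(N, N))
A *= 0.9 / np.abs(np.linalg.eigvals(A)).max()
C = rng.normal(size=N)

z = rng.uniform(-1, 1, size=T)   # i.i.d. scalar input
X = np.zeros((T, N))
for t in range(1, T):
    X[t] = A @ X[t - 1] + C * z[t]
X, z = X[500:], z[500:]          # discard the washout transient

mc = 0.0
for k in range(1, K + 1):
    target = z[:-k]              # delayed input z_{t-k}
    w, *_ = np.linalg.lstsq(X[k:], target, rcond=None)
    mc += np.corrcoef(X[k:] @ w, target)[0, 1] ** 2
print(f"estimated MC = {mc:.2f}, theoretical bound = N = {N}")
```

Naive estimates of this kind are where the reported contradictions tend to arise: finite-sample bias in each squared correlation accumulates over delays and the state covariance becomes ill-conditioned, so the estimated MC can spuriously approach or exceed the bound.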

Infinite-dimensional reservoir computing

no code implementations • 2 Apr 2023 • Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

The reservoir architectures used for the approximation and estimation of elements in the new class are randomly generated echo state networks with either linear or ReLU activation functions.

Generalization Bounds
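For context: an echo state network keeps its randomly generated recurrent weights fixed and trains only a linear readout, which is what makes random generation of the architecture workable. A minimal sketch of the ReLU variant mentioned above (sizes, scalings, and the toy target are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
N, d, T = 200, 3, 2000                     # reservoir size, input dim, length

A = rng.normal(size=(N, N))
A *= 0.8 / np.abs(np.linalg.eigvals(A)).max()   # contractive recurrent weights
C = rng.normal(size=(N, d))
b = rng.normal(size=N)

def esn_states(z, sigma=lambda v: np.maximum(v, 0.0)):   # ReLU by default
    """Iterate x_t = sigma(A x_{t-1} + C z_t + b); all weights stay fixed."""
    X = np.zeros((len(z), N))
    for t in range(1, len(z)):
        X[t] = sigma(A @ X[t - 1] + C @ z[t] + b)
    return X

z = rng.uniform(-1, 1, size=(T, d))
y = z[:, 0] * np.roll(z[:, 1], 2)          # toy target with short memory
X = esn_states(z)
W, *_ = np.linalg.lstsq(X[100:], y[100:], rcond=None)   # train readout only
```

Swapping `sigma` for the identity gives the linear variant; in either case only the least-squares readout is estimated.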

Reservoir kernels and Volterra series

1 code implementation • 30 Dec 2022 • Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

A universal kernel is constructed whose sections approximate any causal and time-invariant filter in the fading memory category with inputs and outputs in a finite-dimensional Euclidean space.
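Operationally, a kernel of this kind lets one work with reservoir functionals without ever simulating a reservoir: evaluate a recursively defined inner product on pairs of input sequences, then run kernel ridge regression on the Gram matrix. The recursion below is an illustrative fading-memory kernel in that spirit, not necessarily the paper's exact construction; `lam` and `tau` are assumed parameters chosen so the recursion stays bounded for uniformly bounded inputs:

```python
import numpy as np

def reservoir_kernel(z1, z2, lam=0.7, tau=0.9):
    """Recursive fading-memory kernel K_t = 1 + tau * lam**2 * <z1_t, z2_t> * K_{t-1}.

    Illustrative recursion only; bounded whenever tau * lam**2 * |<u, v>| < 1,
    which holds here because the inputs are uniformly bounded.
    """
    K = 1.0
    for u, v in zip(z1, z2):
        K = 1.0 + tau * lam ** 2 * float(np.dot(u, v)) * K
    return K

rng = np.random.default_rng(2)
seqs = [rng.uniform(-1, 1, size=(20, 2)) for _ in range(100)]
y = np.array([s[-1, 0] * s[-2, 1] for s in seqs])         # toy causal target
G = np.array([[reservoir_kernel(a, b) for b in seqs] for a in seqs])
alpha = np.linalg.solve(G + 1e-3 * np.eye(len(seqs)), y)  # kernel ridge fit
```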

Reservoir Computing for Macroeconomic Forecasting with Mixed Frequency Data

1 code implementation • 1 Nov 2022 • Giovanni Ballarin, Petros Dellaportas, Lyudmila Grigoryeva, Marcel Hirt, Sophie van Huellen, Juan-Pablo Ortega

Macroeconomic forecasting has recently started embracing techniques that can deal with large-scale datasets and series with unequal release periods.
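The practical obstacle with unequal release periods is aligning, say, monthly predictors with a quarterly target. One standard alignment, sketched below only for orientation (the paper passes such data to reservoir models rather than to this plain regression), stacks the monthly values inside each quarter as regressors:

```python
import numpy as np

rng = np.random.default_rng(3)
monthly = rng.normal(size=240)              # 20 years of a monthly indicator
quarterly = rng.normal(size=80)             # 20 years of a quarterly target

Xq = monthly.reshape(-1, 3)                 # stack the 3 months of each quarter
beta, *_ = np.linalg.lstsq(Xq[:-1], quarterly[1:], rcond=None)
nowcast = Xq[-1] @ beta                     # one-quarter-ahead prediction
```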

Expressive Power of Randomized Signature

no code implementations • NeurIPS Workshop DLDE 2021 • Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann

We consider the question whether the time evolution of controlled differential equations on general state spaces can be arbitrarily well approximated by (regularized) regressions on features generated themselves through randomly chosen dynamical systems of moderately high dimension.

LEMMA • Transfer Learning
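Concretely, a randomized signature propagates a feature state through randomly drawn vector fields driven by the increments of the input path, i.e. an Euler discretization of a controlled differential equation, and then fits a regularized linear regression on the terminal features. A minimal sketch (dimensions, the tanh fields, and the toy target are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
d, k = 2, 60                                # input dimension, feature dimension

A = rng.normal(size=(d, k, k)) / np.sqrt(k) # one random vector field per coordinate
b = rng.normal(size=(d, k))

def randomized_signature(x):
    """Euler scheme S <- S + sum_i tanh(A_i S + b_i) dx^i over a path x of shape (T, d)."""
    S = np.ones(k)
    for dx in np.diff(x, axis=0):
        S = S + sum(np.tanh(A[i] @ S + b[i]) * dx[i] for i in range(d))
    return S

paths = [rng.normal(scale=0.1, size=(50, d)).cumsum(axis=0) for _ in range(200)]
Phi = np.stack([randomized_signature(p) for p in paths])
y = np.array([p[-1, 0] ** 2 for p in paths])                    # toy path functional
w = np.linalg.solve(Phi.T @ Phi + 1e-2 * np.eye(k), Phi.T @ y)  # ridge regression
```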

Learning strange attractors with reservoir systems

no code implementations • 11 Aug 2021 • Lyudmila Grigoryeva, Allen Hart, Juan-Pablo Ortega

This paper shows that the celebrated Embedding Theorem of Takens is a particular case of a much more general statement: randomly generated linear state-space representations of generic observations of an invertible dynamical system carry in their wake an embedding of the phase-space dynamics into the chosen Euclidean state space.

Representation Learning
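The statement suggests a simple experiment: observe a single coordinate of a chaotic system, drive a randomly generated linear state-space system with that scalar signal, and inspect the resulting state cloud, which for generic draws is an embedded copy of the attractor. A minimal sketch using the Lorenz system (integrator, constants, and sizes are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)

def lorenz_observations(T, dt=0.01, s=10.0, r=28.0, beta=8 / 3):
    """Euler-integrate the Lorenz system and observe its first coordinate."""
    p, out = np.array([1.0, 1.0, 1.0]), np.empty(T)
    for t in range(T):
        dp = np.array([s * (p[1] - p[0]),
                       p[0] * (r - p[2]) - p[1],
                       p[0] * p[1] - beta * p[2]])
        p = p + dt * dp
        out[t] = p[0]
    return out

omega = lorenz_observations(20000)

# Randomly generated linear state-space system driven by the observations.
N = 3
A = rng.normal(size=(N, N))
A *= 0.95 / np.abs(np.linalg.eigvals(A)).max()
C = rng.normal(size=N)
states, x = np.empty((len(omega), N)), np.zeros(N)
for t, w in enumerate(omega):
    x = A @ x + C * w
    states[t] = x
# After a transient, `states` traces out a copy of the Lorenz attractor.
```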

Discrete-time signatures and randomness in reservoir computing

no code implementations • 17 Sep 2020 • Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann

A new explanation of the geometric nature of the reservoir computing phenomenon is presented.

Dimension reduction in recurrent networks by canonicalization

no code implementations • 23 Jul 2020 • Lyudmila Grigoryeva, Juan-Pablo Ortega

Many recurrent neural network machine learning paradigms can be formulated using state-space representations.

Dimensionality Reduction
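The template in question writes a recurrent network as x_t = F(x_{t-1}, u_t) with a readout y_t = h(x_t); canonicalization then reduces dimension by discarding state directions that the input-output behavior cannot see. A small sketch of the template, instantiated with an Elman-style recurrence (weights and sizes arbitrary):

```python
import numpy as np

def run(F, h, u, x0):
    """Generic state-space loop: x_t = F(x_{t-1}, u_t), y_t = h(x_t)."""
    x, ys = x0, []
    for u_t in u:
        x = F(x, u_t)
        ys.append(h(x))
    return np.array(ys)

rng = np.random.default_rng(6)
N, d = 10, 2
A = 0.1 * rng.normal(size=(N, N))          # contractive recurrent weights
C, W = rng.normal(size=(N, d)), rng.normal(size=N)
F = lambda x, u: np.tanh(A @ x + C @ u)    # Elman-style recurrence as one instance
h = lambda x: W @ x                        # linear readout
y = run(F, h, rng.normal(size=(100, d)), np.zeros(N))
```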

Memory and forecasting capacities of nonlinear recurrent networks

no code implementations • 22 Apr 2020 • Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

The notion of memory capacity, originally introduced for echo state and linear networks with independent inputs, is generalized to nonlinear recurrent networks with stationary but dependent inputs.

Time Series • Time Series Analysis

Approximation Bounds for Random Neural Networks and Reservoir Systems

no code implementations • 14 Feb 2020 • Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

This work studies approximation based on single-hidden-layer feedforward and recurrent neural networks with randomly generated internal weights.
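In the feedforward case this is random-feature regression: the hidden weights are sampled once and frozen, and only the output layer is fitted, typically by least squares. A minimal sketch approximating a toy target (the ReLU features and sampling distributions are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
n_feat, n_samp = 500, 2000

W = rng.normal(size=n_feat)                  # random, frozen hidden weights
b = rng.uniform(-2, 2, size=n_feat)          # random, frozen hidden biases

x = rng.uniform(-1, 1, size=n_samp)
y = np.sin(3 * x)                            # target function to approximate
Phi = np.maximum(np.outer(x, W) + b, 0.0)    # random ReLU features, (n_samp, n_feat)
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # fit the output layer only
print("train RMSE:", np.sqrt(np.mean((Phi @ c - y) ** 2)))
```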

Risk bounds for reservoir computing

no code implementations • 30 Oct 2019 • Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

We analyze the practices of reservoir computing in the framework of statistical learning theory.

Learning Theory

Differentiable reservoir computing

no code implementations • 16 Feb 2019 • Lyudmila Grigoryeva, Juan-Pablo Ortega

Earlier work on reservoir filters is complemented in this paper with a characterization of their differentiability for very general classes of discrete-time deterministic inputs.

Echo state networks are universal

no code implementations • 3 Jun 2018 • Lyudmila Grigoryeva, Juan-Pablo Ortega

This paper shows that echo state networks are universal uniform approximants in the context of discrete-time fading memory filters with uniformly bounded inputs defined on negative infinite times.


Singular ridge regression with homoscedastic residuals: generalization error with estimated parameters

no code implementations • 29 May 2016 • Lyudmila Grigoryeva, Juan-Pablo Ortega

This paper characterizes the conditional distribution properties of the finite sample ridge regression estimator and uses that result to evaluate total regression and generalization errors that incorporate the inaccuracies committed at the time of parameter estimation.

regression
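The estimator in question is the standard ridge estimator beta_hat = (X'X + lam*I)^{-1} X'y, which remains well defined even when X'X is singular, for instance with more regressors than observations. A minimal sketch of that setting (dimensions and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
n, p = 50, 100                             # more regressors than observations

X = rng.normal(size=(n, p))                # X'X is singular by construction
beta = np.zeros(p)
beta[:5] = 1.0
y = X @ beta + 0.1 * rng.normal(size=n)    # homoscedastic residuals

lam = 1.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)  # ridge estimator
```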

Nonlinear memory capacity of parallel time-delay reservoir computers in the processing of multidimensional signals

no code implementations • 13 Oct 2015 • Lyudmila Grigoryeva, Julie Henriques, Laurent Larger, Juan-Pablo Ortega

This paper addresses the reservoir design problem in the context of delay-based reservoir computers for multidimensional input signals, parallel architectures, and real-time multitasking.
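In a delay-based reservoir computer a single nonlinear node with delayed feedback emulates a whole network: the input is multiplexed along one delay loop by a mask, and each position on the loop acts as a virtual node. A common discretization of that idea, sketched with illustrative gains and nonlinearity (not necessarily the paper's exact model):

```python
import numpy as np

rng = np.random.default_rng(9)
N, T = 50, 1000                            # virtual nodes per loop, input length
mask = rng.uniform(-1, 1, size=N)          # input mask multiplexing z_t over the loop
eta, gamma = 0.5, 0.05                     # feedback and input gains (illustrative)

def delay_reservoir(z):
    """One pass of the delay loop per sample: each virtual node updates from
    its own value one loop earlier, a standard time-delay reservoir scheme."""
    x = np.zeros(N)
    states = np.empty((len(z), N))
    for t, z_t in enumerate(z):
        prev = x.copy()
        for i in range(N):
            x[i] = np.sin(eta * prev[i] + gamma * mask[i] * z_t) ** 2
        states[t] = x
    return states

X = delay_reservoir(rng.uniform(0, 1, size=T))   # readout is trained on X as usual
```

A parallel architecture would run several such loops with independent masks and concatenate their states before training the readout.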
