Search Results for author: Stephen G. Penny

Found 4 papers, 0 papers with code

A Systematic Exploration of Reservoir Computing for Forecasting Complex Spatiotemporal Dynamics

no code implementations • 21 Jan 2022 • Jason A. Platt, Stephen G. Penny, Timothy A. Smith, Tse-Chun Chen, Henry D. I. Abarbanel

While we are not aware of a generally accepted best reported mean forecast time for different models in the literature, we report over a factor of 2 increase in the mean forecast time compared to the best performing RC model of Vlachas et al. (2020) for the 40-dimensional spatiotemporally chaotic Lorenz 1996 dynamics, and we are able to accomplish this using a smaller reservoir size.
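The Lorenz 1996 system mentioned above is a standard benchmark for spatiotemporal chaos. As context for readers unfamiliar with it, here is a minimal sketch of the 40-dimensional model integrated with fourth-order Runge-Kutta; the forcing value `F = 8` is the conventional chaotic regime, and the step size and step count are illustrative choices, not settings from the paper.

```python
import numpy as np

def lorenz96_rhs(x, forcing=8.0):
    """Right-hand side of Lorenz 1996:
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def step_rk4(x, dt, forcing=8.0):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz96_rhs(x, forcing)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_rhs(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# 40-dimensional state; a small perturbation breaks the symmetric equilibrium
x = 8.0 * np.ones(40)
x[0] += 0.01
for _ in range(1000):
    x = step_rk4(x, dt=0.01)
```

Trajectories generated this way are what an RC model is trained to forecast in experiments like the one reported above.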

'Next Generation' Reservoir Computing: an Empirical Data-Driven Expression of Dynamical Equations in Time-Stepping Form

no code implementations • 13 Jan 2022 • Tse-Chun Chen, Stephen G. Penny, Timothy A. Smith, Jason A. Platt

Next generation reservoir computing based on nonlinear vector autoregression (NVAR) is applied to emulate simple dynamical system models and compared to numerical integration schemes such as the Euler and $2^\text{nd}$-order Runge-Kutta methods.

Numerical Integration
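To make the comparison in the abstract above concrete, here is a sketch of the two baseline integration schemes alongside the NVAR feature construction (a constant term plus linear and quadratic monomials of current and lagged states, as in the next-generation reservoir computing literature). The function names and the single-lag choice are illustrative assumptions, not the paper's code.

```python
import numpy as np

def step_euler(f, x, dt):
    # First-order Euler: x_{n+1} = x_n + dt * f(x_n)
    return x + dt * f(x)

def step_rk2(f, x, dt):
    # Second-order Runge-Kutta (midpoint rule)
    return x + dt * f(x + 0.5 * dt * f(x))

def nvar_features(x_now, x_prev):
    """NVAR feature vector: constant + linear terms of the current and one
    lagged state + all unique quadratic products of those linear terms."""
    lin = np.concatenate([x_now, x_prev])
    quad = np.outer(lin, lin)[np.triu_indices(lin.size)]
    return np.concatenate([[1.0], lin, quad])
```

In the NVAR setting, a linear readout is fit (e.g. by ridge regression) to map these features to the state increment `x_{n+1} - x_n`, so the learned map plays the same time-stepping role as the Euler or Runge-Kutta update.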

Integrating Recurrent Neural Networks with Data Assimilation for Scalable Data-Driven State Estimation

no code implementations • 25 Sep 2021 • Stephen G. Penny, Timothy A. Smith, Tse-Chun Chen, Jason A. Platt, Hsin-Yi Lin, Michael Goodliff, Henry D. I. Abarbanel

The results indicate that these techniques can be applied to estimate the state of a system for the repeated initialization of short-term forecasts, even in the absence of a traditional numerical forecast model.
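Data assimilation of the kind referenced above repeatedly corrects a short-term forecast with incoming observations. As generic background (not the paper's specific method), here is a sketch of the standard Kalman analysis step, where the background state could come from a learned surrogate such as an RNN forecast; all variable names are illustrative.

```python
import numpy as np

def kalman_analysis(x_b, P_b, y, H, R):
    """Standard Kalman analysis step: blend a background state x_b
    (e.g. a short-term forecast) with observations y.

    x_b : background (forecast) state mean
    P_b : background error covariance
    y   : observation vector
    H   : observation operator (maps state space to observation space)
    R   : observation error covariance
    """
    S = H @ P_b @ H.T + R                   # innovation covariance
    K = P_b @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_a = x_b + K @ (y - H @ x_b)           # analysis mean
    P_a = (np.eye(x_b.size) - K @ H) @ P_b  # analysis covariance
    return x_a, P_a
```

The analysis state `x_a` then serves as the initial condition for the next short-term forecast, which is the repeated-initialization cycle the abstract describes.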

Forecasting Using Reservoir Computing: The Role of Generalized Synchronization

no code implementations • 4 Feb 2021 • Jason A. Platt, Adrian Wong, Randall Clark, Stephen G. Penny, Henry D. I. Abarbanel

Reservoir computers (RC) are a form of recurrent neural network (RNN) used for forecasting time series data.

Time Series
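For readers new to the architecture discussed above, here is a minimal echo state network sketch: a fixed random reservoir is driven by the input signal (the driving step is where generalized synchronization between input and reservoir state can arise), and only a linear readout is trained, here by ridge regression on a toy sine signal. The reservoir size, spectral radius, and regularization values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 300, 1  # reservoir size, input dimension (illustrative)

# Fixed random reservoir weights, rescaled to spectral radius 0.9
A = rng.uniform(-1, 1, (N, N))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))
W_in = rng.uniform(-0.5, 0.5, (N, d))

def drive(u_seq):
    """Drive the reservoir with an input sequence; return all states."""
    r = np.zeros(N)
    states = []
    for u in u_seq:
        r = np.tanh(A @ r + W_in @ u)
        states.append(r)
    return np.array(states)

# Toy time series: train a linear readout to predict the next input
u_seq = np.sin(0.1 * np.arange(500))[:, None]
washout = 100                       # discard initial transient states
R_states = drive(u_seq[:-1])[washout:]
Y = u_seq[1:][washout:]
beta = 1e-6                         # ridge regularization
W_out = np.linalg.solve(R_states.T @ R_states + beta * np.eye(N),
                        R_states.T @ Y)
pred = R_states @ W_out             # one-step-ahead predictions
```

Only `W_out` is learned; the reservoir itself stays fixed, which is what makes RC training cheap compared to full RNN backpropagation.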
