Search Results for author: Henry D. I. Abarbanel

Found 6 papers, 1 paper with code

A Systematic Exploration of Reservoir Computing for Forecasting Complex Spatiotemporal Dynamics

no code implementations21 Jan 2022 Jason A. Platt, Stephen G. Penny, Timothy A. Smith, Tse-Chun Chen, Henry D. I. Abarbanel

While we are not aware of a generally accepted best reported mean forecast time for different models in the literature, we report over a factor-of-2 increase in mean forecast time compared to the best-performing RC model of Vlachas et al. (2020) for the 40-dimensional spatiotemporally chaotic Lorenz 1996 dynamics, and we accomplish this with a smaller reservoir size.
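The Lorenz 1996 system mentioned above is a standard spatiotemporally chaotic benchmark. Below is a minimal sketch of the 40-dimensional model with the conventional forcing $F = 8$, integrated with a hand-rolled RK4 step; the parameters and step size are illustrative defaults, not the configuration used in the paper.

```python
import numpy as np

def lorenz96_rhs(x, F=8.0):
    """Right-hand side of the Lorenz 1996 model:
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F  (indices cyclic)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(x0, dt=0.01, steps=1000, F=8.0):
    """Fourth-order Runge-Kutta integration; returns the full trajectory."""
    x = np.array(x0, dtype=float)
    traj = np.empty((steps + 1, x.size))
    traj[0] = x
    for n in range(steps):
        k1 = lorenz96_rhs(x, F)
        k2 = lorenz96_rhs(x + 0.5 * dt * k1, F)
        k3 = lorenz96_rhs(x + 0.5 * dt * k2, F)
        k4 = lorenz96_rhs(x + dt * k3, F)
        x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[n + 1] = x
    return traj

# 40-dimensional state near the unstable fixed point x_i = F,
# with a small perturbation to kick off the chaotic dynamics
x0 = 8.0 * np.ones(40)
x0[0] += 0.01
traj = integrate(x0, dt=0.01, steps=2000)
```

Trajectories of this system remain bounded but diverge exponentially from nearby initial conditions, which is what makes mean forecast time a natural figure of merit.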

Integrating Recurrent Neural Networks with Data Assimilation for Scalable Data-Driven State Estimation

no code implementations25 Sep 2021 Stephen G. Penny, Timothy A. Smith, Tse-Chun Chen, Jason A. Platt, Hsin-Yi Lin, Michael Goodliff, Henry D. I. Abarbanel

The results indicate that these techniques can be applied to estimate the state of a system for the repeated initialization of short-term forecasts, even in the absence of a traditional numerical forecast model.

Forecasting Using Reservoir Computing: The Role of Generalized Synchronization

no code implementations4 Feb 2021 Jason A. Platt, Adrian Wong, Randall Clark, Stephen G. Penny, Henry D. I. Abarbanel

Reservoir computers (RC) are a form of recurrent neural network (RNN) used for forecasting time series data.

Time Series
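To make the RC/RNN connection above concrete, here is a minimal echo-state-network sketch: a fixed random recurrent reservoir driven by the input, with only a linear ridge-regression readout trained. All names and parameters (reservoir size, spectral radius, input scaling, ridge penalty) are illustrative assumptions, not the paper's setup, and a sine wave stands in for real time series data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (not from the paper)
N_res, N_in = 200, 1      # reservoir and input dimensions
rho, sigma = 0.9, 0.5     # spectral radius, input scaling
beta = 1e-6               # ridge-regression penalty

# Sparse random reservoir matrix, rescaled to spectral radius rho
A = rng.uniform(-1, 1, (N_res, N_res)) * (rng.random((N_res, N_res)) < 0.05)
A *= rho / max(abs(np.linalg.eigvals(A)))
W_in = rng.uniform(-sigma, sigma, (N_res, N_in))

def run_reservoir(u):
    """Drive the reservoir with input sequence u of shape (T, N_in);
    return the sequence of reservoir states r(t)."""
    r = np.zeros(N_res)
    states = np.empty((len(u), N_res))
    for t, ut in enumerate(u):
        r = np.tanh(A @ r + W_in @ ut)
        states[t] = r
    return states

# Stand-in scalar time series: a sampled sine wave
T = np.arange(0.0, 60.0, 0.1)
u = np.sin(T)[:, None]

# Train a linear readout to map r(t) -> u(t+1) (one-step-ahead forecast)
states = run_reservoir(u[:-1])
W_out = np.linalg.solve(states.T @ states + beta * np.eye(N_res),
                        states.T @ u[1:]).T

pred = states @ W_out.T   # in-sample one-step predictions
```

Only `W_out` is learned; the recurrent weights `A` and `W_in` stay fixed, which is the defining simplification of reservoir computing relative to fully trained RNNs.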

Machine Learning of Time Series Using Time-delay Embedding and Precision Annealing

no code implementations12 Feb 2019 Alexander J. A. Ty, Zheng Fang, Rivver A. Gonzalez, Paul J. Rozdeba, Henry D. I. Abarbanel

We proceed from a scalar time series $s(t_n);\ t_n = t_0 + n \Delta t$ and, using methods of nonlinear time series analysis, show how to produce a $D_E > 1$ dimensional time-delay embedding space in which the time series has no false neighbors, unlike the observed scalar series $s(t_n)$.

BIG-bench Machine Learning, Time Series
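The time-delay embedding described in the abstract above follows the standard Takens-style construction: each embedding vector stacks $D_E$ lagged samples of the scalar series. A minimal sketch, with illustrative values of $D_E$ and the lag (in practice chosen by criteria such as the false-nearest-neighbors test, which is not shown here):

```python
import numpy as np

def delay_embed(s, D_E, tau):
    """Build D_E-dimensional time-delay vectors
    y_n = [s(t_n), s(t_{n+tau}), ..., s(t_{n+(D_E-1)*tau})]
    from a uniformly sampled scalar series s."""
    n_vec = len(s) - (D_E - 1) * tau
    return np.column_stack([s[i * tau : i * tau + n_vec] for i in range(D_E)])

# Example: scalar observations from a sine wave (stand-in data)
s = np.sin(np.arange(0.0, 20.0, 0.05))
Y = delay_embed(s, D_E=3, tau=10)   # shape (380, 3)
```

In the embedded space, points that are close are dynamical neighbors; in the raw one-dimensional series, distinct states can collide, producing the "false neighbors" the abstract refers to.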
