Reconstructing a dynamical system and forecasting time series by self-consistent deep learning

4 Aug 2021 · Zhe Wang, Claude Guet

We introduce a self-consistent deep-learning framework which, for a noisy deterministic time series, provides unsupervised filtering, state-space reconstruction, identification of the underlying differential equations, and forecasting. Without a priori information on the signal, we embed the time series in a state space, where deterministic structures, i.e. attractors, are revealed. Under the assumption that the evolution of solution trajectories is described by an unknown dynamical system, we filter out stochastic outliers. The embedding function, the solution trajectories, and the dynamical system are each represented by a deep neural network. Because the neural solution trajectory is differentiable, the neural dynamical system can be constrained locally at each time, avoiding the need to propagate gradients through a numerical solver. On a chaotic time series masked by additive Gaussian noise, we demonstrate the filtering ability and the predictive power of the proposed framework.
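The abstract gives the key idea but no implementation details, so the sketch below only illustrates, under stated assumptions, how the local, solver-free dynamics constraint could be set up: a neural trajectory x(t) is differentiated with automatic differentiation and matched against a neural vector field f(x) at collocation times. The network architectures, the collocation-style loss, its weighting, and all names (`mlp`, `trajectory`, `field_params`, ...) are assumptions for illustration, not the authors' code.

```python
# Minimal sketch (not the authors' implementation): enforce dx/dt = f(x(t))
# by differentiating a neural trajectory with autodiff, so no ODE solver
# appears in the training loop. All sizes and names are assumptions.
import jax
import jax.numpy as jnp


def mlp(params, x):
    # Simple fully connected network with tanh activations.
    for W, b in params[:-1]:
        x = jnp.tanh(W @ x + b)
    W, b = params[-1]
    return W @ x + b


def init_mlp(key, sizes):
    # Random initialization of an MLP with the given layer sizes.
    params = []
    for din, dout in zip(sizes[:-1], sizes[1:]):
        key, k1, k2 = jax.random.split(key, 3)
        params.append((0.1 * jax.random.normal(k1, (dout, din)),
                       0.1 * jax.random.normal(k2, (dout,))))
    return params


def trajectory(traj_params, t):
    # Neural solution trajectory x(t): scalar time -> state vector.
    return mlp(traj_params, jnp.asarray(t).reshape(1))


def loss(traj_params, field_params, ts, ys):
    # Data term: the trajectory should stay close to the (noisy) embedded series.
    xs = jax.vmap(lambda t: trajectory(traj_params, t))(ts)
    data = jnp.mean((xs - ys) ** 2)
    # Dynamics term: the time derivative of the neural trajectory at each
    # collocation time must match the neural vector field f(x(t)).
    dxdt = jax.vmap(
        lambda t: jax.jacfwd(lambda s: trajectory(traj_params, s))(t))(ts)
    fx = jax.vmap(lambda x: mlp(field_params, x))(xs)
    dyn = jnp.mean((dxdt - fx) ** 2)
    return data + dyn  # relative weighting is a free hyperparameter here
```

Training would minimize this loss jointly over `traj_params` and `field_params` with any gradient-based optimizer; the embedding and outlier-filtering stages described in the abstract are omitted from this sketch.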
