ODEN: A Framework to Solve Ordinary Differential Equations using Artificial Neural Networks

28 May 2020 · Liam L. H. Lau, Denis Werth

We explore in detail a method to solve ordinary differential equations using feedforward neural networks. We prove that a specific loss function, which does not require knowledge of the exact solution, is a suitable standard metric for evaluating a neural network's performance. Neural networks are shown to be proficient at approximating continuous solutions within their training domains. We illustrate the ability of neural networks to outperform traditional standard numerical techniques. Training is thoroughly examined and three universal phases are found: (i) a prior tangent adjustment, (ii) a curvature fitting, and (iii) a fine-tuning stage. The main limitation of the method is the nontrivial task of finding an appropriate neural network architecture and hyperparameters for efficient optimization. However, we observe an optimal architecture that matches the complexity of the differential equation. A user-friendly and adaptable open-source code (ODE$\mathcal{N}$) is provided on GitHub.
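The abstract does not spell out the loss function itself, so the following is only a minimal sketch of the general idea it describes: an unsupervised loss built from the ODE residual and the boundary condition, which requires no exact solution. The example trains a small feedforward network on the toy problem $y'(x) = -y(x)$, $y(0) = 1$ on $[0, 2]$. This is a hypothetical illustration in TensorFlow, not the authors' ODE$\mathcal{N}$ implementation; the network size, domain, sampling, and optimizer settings are all assumptions.

```python
# Sketch: train a feedforward network N(x) so that the ODE residual
# dy/dx + y = 0 and the boundary condition y(0) = 1 are both minimised.
# No reference to the exact solution exp(-x) appears in the loss.
import tensorflow as tf

# Hypothetical architecture and training domain (not from the paper).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="tanh", input_shape=(1,)),
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(1),
])

x = tf.reshape(tf.linspace(0.0, 2.0, 100), (-1, 1))  # collocation points
x0 = tf.constant([[0.0]])                            # boundary point
y0 = 1.0                                             # boundary value

optimizer = tf.keras.optimizers.Adam(1e-3)

@tf.function
def train_step():
    with tf.GradientTape() as tape:
        # Inner tape computes dy/dx at the collocation points.
        with tf.GradientTape() as inner:
            inner.watch(x)
            y = model(x)
        dy_dx = inner.gradient(y, x)
        residual = dy_dx + y                          # ODE: y' = -y
        loss = (tf.reduce_mean(tf.square(residual))
                + tf.square(model(x0)[0, 0] - y0))    # boundary penalty
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

for epoch in range(5000):
    loss = train_step()
# model(x) now approximates exp(-x) within the training domain [0, 2].
```

Because the loss is just the mean squared ODE residual plus a boundary term, its value can also serve as the performance metric the abstract refers to: it can be monitored during training without ever evaluating the exact solution.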
