Neural Partial Differential Equations

1 Jan 2021  ·  Jungeun Kim, Seunghyun Hwang, Jihyun Hwang, Kookjin Lee, Dongeun Lee, Noseong Park

Neural ordinary differential equations (neural ODEs) approximate a neural network as a system of ODEs by treating the layer index as a continuous variable and discretizing the hidden dimension. While they have the desirable property of typically requiring far fewer parameters than conventional neural networks, neural ODEs are known to be numerically unstable and slow when solving their integral problems, which introduces error and delay into the forward-pass computation. In this work, we present a novel partial differential equation (PDE)-based approach that removes the need to solve integral problems and treats both the layer index and the hidden dimension as continuous variables. Owing to recent advances in learning PDEs, the proposed concept, called \emph{neural partial differential equations} (neural PDEs), can be implemented. Our method achieves comparable (or better) accuracy with much shorter forward-pass inference time than neural ODEs across various tasks.
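To make the neural-ODE formulation above concrete, here is a minimal, self-contained sketch of a neural ODE forward pass: the hidden state evolves as dh/dt = f(h, t; θ) over the continuous layer variable t, and the output h(T) is obtained by numerically integrating (here with a fixed-step Euler solver). The dynamics function, parameters, and step count are illustrative assumptions, not the paper's implementation; in practice an adaptive solver (e.g. Dormand-Prince) is used, which is exactly the step whose cost and instability the paper's PDE-based approach aims to avoid.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4

# Hypothetical dynamics parameters theta = (W, b), shared across all t.
W = rng.normal(scale=0.1, size=(dim, dim))
b = np.zeros(dim)

def f(h, t):
    """Dynamics network f(h, t; theta): a single tanh layer."""
    return np.tanh(h @ W + b)

def neural_ode_forward(h0, T=1.0, steps=50):
    """Approximate h(T) = h(0) + integral_0^T f(h(t), t) dt
    with fixed-step Euler integration over the layer variable t."""
    h = h0
    dt = T / steps
    for i in range(steps):
        h = h + dt * f(h, i * dt)  # one Euler step of size dt
    return h

h0 = rng.normal(size=dim)
hT = neural_ode_forward(h0)
```

Note that the per-example cost grows with the number of solver steps, which is the source of the slow forward pass the abstract refers to; doubling `steps` roughly doubles inference time while only refining the integral approximation.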
