no code implementations • 28 Dec 2022 • Xingsheng Sun, Burigede Liu
This paper studies optimal (supremum and infimum) uncertainty bounds for systems in which the input (or prior) probability measure is only partially or imperfectly known (e.g., only through statistical moments and/or on a coarse topology) rather than fully specified.
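As a rough illustration of the kind of optimization this entry describes (a sketch, not the paper's exact formulation): given a quantity of interest g(X), an event E, and an admissible set \mathcal{A} of probability measures consistent with the partial information (e.g., prescribed moments), the optimal bounds take the form

\[ \overline{\mathbb{P}} \;=\; \sup_{\mu \in \mathcal{A}} \mu\big[g(X) \in E\big], \qquad \underline{\mathbb{P}} \;=\; \inf_{\mu \in \mathcal{A}} \mu\big[g(X) \in E\big], \]

where g, X, E, and \mathcal{A} are illustrative placeholders rather than the paper's notation.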
7 code implementations • 11 Jul 2022 • Zongyi Li, Daniel Zhengyu Huang, Burigede Liu, Anima Anandkumar
The resulting geo-FNO model has both the computational efficiency of the FFT and the flexibility to handle arbitrary geometries.
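Roughly, the flexibility comes from mapping the irregular physical geometry to a latent uniform grid, where FFT-based spectral layers can be applied. The snippet below is a minimal, hypothetical sketch of such an FFT-based spectral multiplication and why it is cheap (O(n log n)); it is not the geo-FNO implementation, and spectral_conv_1d and its arguments are illustrative names.

```python
import numpy as np

def spectral_conv_1d(v, weights, modes):
    """Multiply the lowest Fourier modes of each signal by learned weights.

    v:       (batch, n) real-valued signals on a uniform grid
    weights: (modes,) complex multipliers (a full Fourier layer would also
             mix channels; that is omitted to keep the sketch short)
    """
    v_hat = np.fft.rfft(v, axis=-1)                  # O(n log n) forward FFT
    out_hat = np.zeros_like(v_hat)
    out_hat[:, :modes] = v_hat[:, :modes] * weights  # truncate + multiply
    return np.fft.irfft(out_hat, n=v.shape[-1], axis=-1)

rng = np.random.default_rng(0)
signals = rng.standard_normal((4, 256))
w = rng.standard_normal(8) + 1j * rng.standard_normal(8)
out = spectral_conv_1d(signals, w, modes=8)
print(out.shape)  # (4, 256)
```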
6 code implementations • 6 Nov 2021 • Zongyi Li, Hongkai Zheng, Nikola Kovachki, David Jin, Haoxuan Chen, Burigede Liu, Kamyar Azizzadenesheli, Anima Anandkumar
Specifically, in PINO, we combine coarse-resolution training data with PDE constraints imposed at a higher resolution.
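A minimal sketch of that idea, not the authors' PINO code: the toy 1D Poisson problem, the small MLP standing in for the operator network, and the loss weights below are assumptions chosen only to show a coarse-resolution data loss combined with a finer-resolution PDE-residual loss.

```python
import math
import torch

torch.manual_seed(0)

# Toy problem: -u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0, exact u = sin(pi x).
def f(x):
    return (math.pi ** 2) * torch.sin(math.pi * x)

# Stand-in for a neural operator: a small MLP mapping x -> u(x).
model = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

x_coarse = torch.linspace(0, 1, 17).unsqueeze(-1)    # coarse grid with data
u_coarse = torch.sin(math.pi * x_coarse)             # "training data"
x_fine = torch.linspace(0, 1, 129).unsqueeze(-1)     # finer grid for the PDE
h = float(x_fine[1] - x_fine[0])

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    opt.zero_grad()
    # (1) data loss at coarse resolution
    data_loss = torch.mean((model(x_coarse) - u_coarse) ** 2)
    # (2) PDE residual at higher resolution via a finite-difference Laplacian
    u = model(x_fine).squeeze(-1)
    lap = (u[2:] - 2 * u[1:-1] + u[:-2]) / h ** 2
    pde_loss = torch.mean((-lap - f(x_fine[1:-1]).squeeze(-1)) ** 2)
    # (3) boundary conditions
    bc_loss = u[0] ** 2 + u[-1] ** 2
    loss = data_loss + 0.1 * pde_loss + bc_loss      # weights are illustrative
    loss.backward()
    opt.step()
```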
1 code implementation • 19 Aug 2021 • Nikola Kovachki, Zongyi Li, Burigede Liu, Kamyar Azizzadenesheli, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces or finite sets.
2 code implementations • 13 Jun 2021 • Zongyi Li, Miguel Liu-Schiaffini, Nikola Kovachki, Burigede Liu, Kamyar Azizzadenesheli, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
Chaotic systems are notoriously challenging to predict because of their sensitivity to perturbations and to errors introduced by time stepping.
19 code implementations • ICLR 2021 • Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces.
4 code implementations • NeurIPS 2020 • Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
One of the main challenges in using deep learning-based methods for simulating physical systems and solving partial differential equations (PDEs) is formulating physics-based data in the desired structure for neural networks.
6 code implementations • ICLR Workshop DeepDiffEq 2019 • Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar
The classical development of neural networks has been primarily for mappings between a finite-dimensional Euclidean space and a set of classes, or between two finite-dimensional Euclidean spaces.