Search Results for author: Nikola Kovachki

Found 10 papers, 7 papers with code

Multi-Grid Tensorized Fourier Neural Operator for High-Resolution PDEs

no code implementations • 29 Sep 2023 • Jean Kossaifi, Nikola Kovachki, Kamyar Azizzadenesheli, Anima Anandkumar

Our contributions are threefold: i) we enable parallelization over input samples with a novel multi-grid-based domain decomposition, ii) we represent the parameters of the model in a high-order latent subspace of the Fourier domain, through a global tensor factorization, resulting in an extreme reduction in the number of parameters and improved generalization, and iii) we propose architectural improvements to the backbone FNO.

Operator learning
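
For contribution ii) above, here is a minimal sketch, assuming a PyTorch-style 1D spectral convolution whose weight tensor is kept in CP-factorized form instead of being stored densely; the class and parameter names are illustrative, not taken from the paper's released code:

```python
import torch

class FactorizedSpectralConv1d(torch.nn.Module):
    """Spectral convolution whose (in, out, modes) weight tensor is stored
    as three CP factors of rank `rank` (illustrative simplification)."""

    def __init__(self, channels, n_modes, rank):
        super().__init__()
        scale = 1.0 / channels
        self.f_in = torch.nn.Parameter(scale * torch.randn(channels, rank, dtype=torch.cfloat))
        self.f_out = torch.nn.Parameter(scale * torch.randn(channels, rank, dtype=torch.cfloat))
        self.f_modes = torch.nn.Parameter(scale * torch.randn(n_modes, rank, dtype=torch.cfloat))
        self.n_modes = n_modes  # must satisfy n_modes <= n_points // 2 + 1

    def forward(self, x):  # x: (batch, channels, n_points), real-valued
        x_ft = torch.fft.rfft(x)  # transform to the Fourier domain
        # Rebuild the low-rank weight and apply it to the retained modes only.
        w = torch.einsum("ir,or,mr->iom", self.f_in, self.f_out, self.f_modes)
        out_ft = torch.zeros_like(x_ft)
        out_ft[..., : self.n_modes] = torch.einsum(
            "bim,iom->bom", x_ft[..., : self.n_modes], w
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))  # back to physical space
```

In this simplified CP form the layer holds (2·channels + n_modes)·rank complex parameters instead of channels²·n_modes, which is the kind of reduction the tensor factorization in ii) is after; the paper's global factorization across the whole model is more elaborate.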

Neural Operators for Accelerating Scientific Simulations and Design

no code implementations • 27 Sep 2023 • Kamyar Azizzadenesheli, Nikola Kovachki, Zongyi Li, Miguel Liu-Schiaffini, Jean Kossaifi, Anima Anandkumar

Scientific discovery and engineering design are currently limited by the time and cost of physical experiments, which are selected mostly through trial and error and intuition requiring deep domain expertise.

Super-Resolution • Weather Forecasting

Tipping Point Forecasting in Non-Stationary Dynamics on Function Spaces

no code implementations • 17 Aug 2023 • Miguel Liu-Schiaffini, Clare E. Singer, Nikola Kovachki, Tapio Schneider, Kamyar Azizzadenesheli, Anima Anandkumar

Tipping points are abrupt, drastic, and often irreversible changes in the evolution of non-stationary and chaotic dynamical systems.

Conformal Prediction
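
A hedged reading of the "Conformal Prediction" tag in this forecasting setting: calibrate a distribution-free band on a forecaster's one-step errors, then treat sustained excursions outside the band as candidate tipping events. This is a generic split-conformal sketch of that recipe, not the paper's algorithm; the `patience` heuristic is my own assumption:

```python
import numpy as np

def conformal_threshold(cal_errors, alpha=0.1):
    """Finite-sample-corrected (1 - alpha) quantile of held-out
    calibration errors (standard split conformal prediction)."""
    n = len(cal_errors)
    q = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(cal_errors, q)

def first_tipping_flag(test_errors, threshold, patience=5):
    """Return the first index at which the forecast error stays above the
    conformal band for `patience` consecutive steps, else None."""
    run = 0
    for t, err in enumerate(test_errors):
        run = run + 1 if err > threshold else 0
        if run >= patience:
            return t - patience + 1
    return None
```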

Learning Homogenization for Elliptic Operators

2 code implementations • 21 Jun 2023 • Kaushik Bhattacharya, Nikola Kovachki, Aakila Rajan, Andrew M. Stuart, Margaret Trautner

However, a major challenge in data-driven learning approaches for this problem has remained unexplored: the impact of discontinuities and corner interfaces in the underlying material.
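
For context on the learning target here (standard periodic homogenization, stated from textbook theory rather than quoted from the paper): the corrector of the cell problem determines the homogenized coefficient, and discontinuities and corner interfaces in A are precisely what make the corrector hard to approximate.

```latex
% Cell problem on the unit torus, one corrector \chi_j per direction e_j:
\[
  -\nabla_y \cdot \bigl( A(y)\, (\nabla_y \chi_j(y) + e_j) \bigr) = 0,
  \qquad y \in \mathbb{T}^d,
\]
% and the constant homogenized coefficient it induces:
\[
  \bar{A}\, e_j = \int_{\mathbb{T}^d} A(y) \bigl( \nabla_y \chi_j(y) + e_j \bigr)\, \mathrm{d}y .
\]
```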

Neural Operator: Learning Maps Between Function Spaces

1 code implementation • 19 Aug 2021 • Nikola Kovachki, Zongyi Li, Burigede Liu, Kamyar Azizzadenesheli, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar

The classical development of neural networks has primarily focused on learning mappings between finite dimensional Euclidean spaces or finite sets.

Operator learning
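
The central construction of this framework is a layer whose linear part is a learnable kernel integral operator acting directly on functions: with input coefficient a, hidden function v_t on a domain D, a neural-network kernel \kappa_\phi, and a pointwise linear map W, the update reads

```latex
\[
  v_{t+1}(x) = \sigma\!\Bigl( W\, v_t(x)
    + \int_{D} \kappa_{\phi}\bigl(x, y, a(x), a(y)\bigr)\, v_t(y)\, \mathrm{d}\nu(y) \Bigr),
  \qquad x \in D .
\]
```

Because neither \kappa_\phi nor W depends on how D is discretized, the same trained model can be queried at any resolution.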

Learning Dissipative Dynamics in Chaotic Systems

2 code implementations • 13 Jun 2021 • Zongyi Li, Miguel Liu-Schiaffini, Nikola Kovachki, Burigede Liu, Kamyar Azizzadenesheli, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar

Chaotic systems are notoriously challenging to predict because of their sensitivity to perturbations and errors due to time stepping.
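
A rough sketch, under my own assumptions rather than from the paper's code, of the two ingredients the title points to: compose a learned one-step (Markov) solution operator autoregressively, and regularize it to be dissipative, i.e. to map large-norm states strictly inward so that rollouts cannot blow up. `step_model`, `rho`, and the sphere-sampling penalty are illustrative stand-ins:

```python
import torch

def rollout(step_model, u0, n_steps):
    """Compose a learned one-step solution operator autoregressively."""
    traj, u = [u0], u0
    for _ in range(n_steps):
        u = step_model(u)
        traj.append(u)
    return torch.stack(traj)

def dissipativity_penalty(step_model, radius, batch, dim, rho=0.9):
    """Sample states on a sphere of norm `radius` and penalize any that are
    not mapped strictly inward (norm shrunk by at least a factor `rho`)."""
    z = torch.randn(batch, dim)
    z = radius * z / z.norm(dim=-1, keepdim=True)
    return torch.relu(step_model(z).norm(dim=-1) - rho * radius).mean()
```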

Multipole Graph Neural Operator for Parametric Partial Differential Equations

4 code implementations • NeurIPS 2020 • Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar

One of the main challenges in using deep learning-based methods for simulating physical systems and solving partial differential equations (PDEs) is formulating physics-based data in the desired structure for neural networks.
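
To gesture at the multipole construction (a deliberately simplified sketch of mine, not the released code): kernel transforms are applied on a hierarchy of coarsened node sets and summed back on the finest level, so long-range interactions are captured at roughly linear cost. The restriction/prolongation operators are assumed precomputed:

```python
import torch

class MultiLevelKernel(torch.nn.Module):
    """V-cycle-flavored aggregation: apply a learned transform at each
    graph resolution and sum the results on the finest level."""

    def __init__(self, channels, levels):
        super().__init__()
        self.lins = torch.nn.ModuleList(
            torch.nn.Linear(channels, channels) for _ in range(levels)
        )

    def forward(self, v, pools, unpools):
        # v: (n0, c) finest-level node features.
        # pools[l]: (n_{l+1}, n_l) restriction to the next coarser level.
        # unpools[l]: (n0, n_{l+1}) prolongation straight back to level 0.
        out, u = self.lins[0](v), v
        for l in range(1, len(self.lins)):
            u = pools[l - 1] @ u                          # restrict
            out = out + unpools[l - 1] @ self.lins[l](u)  # transform & prolong
        return torch.relu(out)
```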

Neural Operator: Graph Kernel Network for Partial Differential Equations

6 code implementations • ICLR Workshop DeepDiffEq 2019 • Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar

The classical development of neural networks has been primarily for mappings between a finite-dimensional Euclidean space and a set of classes, or between two finite-dimensional Euclidean spaces.
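
One layer of such a graph kernel network can be sketched as follows (a hedged PyTorch rendering in which averaging neural-network kernel messages over each node's neighborhood stands in for the kernel integral; all names are mine, not the repository's):

```python
import torch

class GraphKernelLayer(torch.nn.Module):
    def __init__(self, channels, edge_dim, hidden=64):
        super().__init__()
        self.w = torch.nn.Linear(channels, channels)
        # kappa maps edge features, typically the concatenation
        # (x, y, a(x), a(y)), to a channels x channels matrix.
        self.kappa = torch.nn.Sequential(
            torch.nn.Linear(edge_dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, channels * channels),
        )

    def forward(self, v, edge_index, edge_attr):
        # v: (n, c); edge_index: (2, e) [source, target]; edge_attr: (e, edge_dim)
        src, dst = edge_index
        c = v.size(1)
        k = self.kappa(edge_attr).view(-1, c, c)            # per-edge kernel matrix
        msgs = torch.einsum("eij,ej->ei", k, v[src])        # kernel applied to v(y)
        agg = torch.zeros_like(v).index_add_(0, dst, msgs)  # sum messages per node
        deg = torch.zeros(v.size(0), 1).index_add_(
            0, dst, torch.ones(dst.size(0), 1)
        ).clamp(min=1.0)
        return torch.relu(self.w(v) + agg / deg)            # averaged kernel integral
```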
