no code implementations • 1 Dec 2023 • Rambod Mojgani, Daniel Waelchli, Yifei Guan, Petros Koumoutsakos, Pedram Hassanzadeh
Reinforcement learning is emerging as a potent alternative for developing such closures, as it requires only low-order statistics and leads to stable closures.
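As a rough illustration of the idea only (not the paper's multi-agent setup, which couples the agent to a turbulence solver), here is a minimal sketch in which a Gaussian policy over a single closure coefficient is trained with REINFORCE using a reward built solely from a low-order statistic; `run_les`, the target statistic, and the "optimal" coefficient are hypothetical stand-ins.

```python
# Minimal sketch: a Gaussian policy over a scalar eddy-viscosity-like coefficient,
# trained with REINFORCE; the reward depends only on a low-order statistic.
import numpy as np

rng = np.random.default_rng(0)
TARGET_STAT = 1.0                                # low-order statistic from "truth" (e.g. DNS)

def run_les(c_s):
    """Hypothetical stand-in for an LES run returning a low-order statistic."""
    return (c_s - 0.17) ** 2 * 10.0 + 1.0 + 0.01 * rng.standard_normal()

mu, sigma, lr, baseline = 0.5, 0.1, 0.005, 0.0   # Gaussian policy over the coefficient
for episode in range(1000):
    c_s = mu + sigma * rng.standard_normal()     # sample a closure coefficient (action)
    reward = -(run_les(c_s) - TARGET_STAT) ** 2  # reward uses low-order statistics only
    baseline = 0.9 * baseline + 0.1 * reward     # running baseline for variance reduction
    mu += lr * (c_s - mu) / sigma**2 * (reward - baseline)   # REINFORCE update of the mean
print(f"learned coefficient: {mu:.3f}  (toy optimum near 0.17)")
```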
1 code implementation • 8 Jun 2023 • Karan Jakhar, Yifei Guan, Rambod Mojgani, Ashesh Chattopadhyay, Pedram Hassanzadeh, Laura Zanna
There is growing interest in discovering interpretable, closed-form equations for subgrid-scale (SGS) closures/parameterizations of complex processes in the Earth system.
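A hedged sketch of the general idea, discovering a closed-form SGS-like model by sparse regression over a library of candidate terms; the library, the synthetic "true" closure, and the sequentially thresholded least-squares scheme below are illustrative stand-ins, not the paper's specific discovery algorithm or data.

```python
# Sparse regression over a term library recovers a closed-form model from synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Pretend these are filtered-field quantities sampled over space/time
dudx = rng.standard_normal(n)
dudy = rng.standard_normal(n)
lap_u = rng.standard_normal(n)

# Synthetic "SGS stress": tau = 0.8*dudx*dudy - 0.3*lap_u (+ noise)
tau = 0.8 * dudx * dudy - 0.3 * lap_u + 0.01 * rng.standard_normal(n)

# Library of candidate closed-form terms
library = np.column_stack([dudx, dudy, lap_u, dudx * dudy, dudx**2, dudy**2])
names = ["du/dx", "du/dy", "lap(u)", "du/dx*du/dy", "(du/dx)^2", "(du/dy)^2"]

coef, *_ = np.linalg.lstsq(library, tau, rcond=None)
for _ in range(10):                       # iterate: threshold small terms, refit the rest
    small = np.abs(coef) < 0.05
    coef[small] = 0.0
    keep = ~small
    coef[keep], *_ = np.linalg.lstsq(library[:, keep], tau, rcond=None)

for name, c in zip(names, coef):
    if c != 0.0:
        print(f"{name:>12s}: {c:+.3f}")
```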
1 code implementation • 14 Apr 2023 • Ashesh Chattopadhyay, Pedram Hassanzadeh
Owing to their computational cost, physics-based digital twins, though long-term stable, are intractable for real-time decision-making.
no code implementations • 9 Jun 2022 • Ashesh Chattopadhyay, Ebrahim Nabizadeh, Eviatar Bach, Pedram Hassanzadeh
With small ensembles, the estimated background error covariance matrix in the EnKF algorithm suffers from sampling error, leading to an erroneous estimate of the analysis state (initial condition for the next forecast cycle).
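To make the sampling-error point concrete, here is a minimal sketch (with a synthetic background covariance, not the paper's setup) comparing a covariance estimated from a small ensemble against a large one, and showing how a simple localization taper damps the spurious long-range correlations.

```python
# Sampling error in an ensemble-estimated background covariance, and a Gaussian
# localization taper (Schur product) that suppresses spurious correlations.
import numpy as np

rng = np.random.default_rng(2)
n = 40                                           # state dimension
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
dist = np.minimum(dist, n - dist)                # periodic distance
B_true = np.exp(-(dist / 3.0) ** 2) + 1e-6 * np.eye(n)   # smooth "true" covariance (+ jitter)

def sample_cov(n_ens):
    ens = rng.multivariate_normal(np.zeros(n), B_true, size=n_ens)
    return np.cov(ens, rowvar=False)

B_small = sample_cov(10)                         # small ensemble -> noisy estimate
B_large = sample_cov(5000)                       # large ensemble -> close to truth

loc = np.exp(-(dist / 5.0) ** 2)                 # localization taper
B_localized = loc * B_small

for name, B in [("small ens", B_small), ("localized", B_localized), ("large ens", B_large)]:
    err = np.linalg.norm(B - B_true) / np.linalg.norm(B_true)
    print(f"{name:>10s}: relative covariance error = {err:.2f}")
```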
1 code implementation • 7 Jun 2022 • Adam Subel, Yifei Guan, Ashesh Chattopadhyay, Pedram Hassanzadeh
For effective TL, we need to know: 1) which layers are the best to re-train?
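A minimal sketch of the freezing/re-training mechanics behind that question, using a toy CNN and random data in PyTorch; the network, the choice of layer, and the data are placeholders, not the paper's configuration.

```python
# Transfer learning by freezing all layers except a chosen one and re-training it.
import torch
import torch.nn as nn

# Toy stand-in for a CNN already trained on the base system
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)

# Freeze everything, then unfreeze only the layer chosen for re-training
for p in model.parameters():
    p.requires_grad = False
for p in model[2].parameters():            # e.g. re-train the second conv layer
    p.requires_grad = True

opt = torch.optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(8, 1, 32, 32)              # stand-in for resolved-field inputs (new system)
y = torch.randn(8, 1, 32, 32)              # stand-in for SGS-term targets (new system)
for step in range(20):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
print("re-training loss:", float(loss))
```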
1 code implementation • 9 May 2022 • Ashesh Chattopadhyay, Jaideep Pathak, Ebrahim Nabizadeh, Wahid Bhimji, Pedram Hassanzadeh
In this paper, we propose a convolutional variational autoencoder-based stochastic data-driven model that is pre-trained on an imperfect climate model simulation from a 2-layer quasi-geostrophic flow and re-trained, using transfer learning, on a small number of noisy observations from a perfect simulation.
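A hedged sketch of that two-stage workflow with a small convolutional VAE in PyTorch: pre-train on plentiful "imperfect model" samples, then freeze the encoder and re-train on a handful of noisy samples. The architecture, the freezing choice, and the random data are illustrative assumptions, not the paper's model or simulations.

```python
# Small convolutional VAE: pre-train on many samples, then transfer-learn on few.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvVAE(nn.Module):
    def __init__(self, latent=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 16, 4, 2, 1), nn.ReLU(),
                                 nn.Conv2d(16, 32, 4, 2, 1), nn.ReLU(), nn.Flatten())
        self.mu = nn.Linear(32 * 8 * 8, latent)
        self.logvar = nn.Linear(32 * 8 * 8, latent)
        self.fc = nn.Linear(latent, 32 * 8 * 8)
        self.dec = nn.Sequential(nn.ConvTranspose2d(32, 16, 4, 2, 1), nn.ReLU(),
                                 nn.ConvTranspose2d(16, 1, 4, 2, 1))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization
        xhat = self.dec(self.fc(z).view(-1, 32, 8, 8))
        return xhat, mu, logvar

def vae_loss(xhat, x, mu, logvar):
    rec = F.mse_loss(xhat, x)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + 1e-3 * kl

def train(model, data, epochs, lr):
    opt = torch.optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=lr)
    for _ in range(epochs):
        xhat, mu, logvar = model(data)
        loss = vae_loss(xhat, data, mu, logvar)
        opt.zero_grad()
        loss.backward()
        opt.step()

model = ConvVAE()
train(model, torch.randn(64, 1, 32, 32), epochs=5, lr=1e-3)   # pre-train: "imperfect model" fields
# Transfer learning: freeze the encoder, re-train the rest on a few noisy samples
for p in model.enc.parameters():
    p.requires_grad = False
train(model, torch.randn(4, 1, 32, 32), epochs=5, lr=1e-4)    # few noisy "observations"
```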
1 code implementation • 5 May 2022 • Rambod Mojgani, Maciej Balajewicz, Pedram Hassanzadeh
A parallel architecture with two branches is proposed.
3 code implementations • 22 Feb 2022 • Jaideep Pathak, Shashank Subramanian, Peter Harrington, Sanjeev Raja, Ashesh Chattopadhyay, Morteza Mardani, Thorsten Kurth, David Hall, Zongyi Li, Kamyar Azizzadenesheli, Pedram Hassanzadeh, Karthik Kashinath, Animashree Anandkumar
FourCastNet accurately forecasts high-resolution, fast-timescale variables such as the surface wind speed, precipitation, and atmospheric water vapor.
1 code implementation • 1 Oct 2021 • Rambod Mojgani, Ashesh Chattopadhyay, Pedram Hassanzadeh
Models of many engineering and natural systems are imperfect.
1 code implementation • 16 Mar 2021 • Ashesh Chattopadhyay, Mustafa Mustafa, Pedram Hassanzadeh, Eviatar Bach, Karthik Kashinath
These components are: 1) a deep spatial transformer added to the latent space of the U-NETs to preserve a property called equivariance, which is related to correctly capturing rotations and scalings of features in spatio-temporal data; 2) a data-assimilation (DA) algorithm to ingest noisy observations and improve the initial conditions for the next forecasts; and 3) a multi-time-step algorithm, which combines forecasts from DDWP models with different time steps through DA, improving the accuracy of forecasts at short intervals.
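As an illustration of component 2) only, here is a minimal Kalman-style analysis step that blends a forecast from a toy linear "DDWP" with a noisy observation to produce an improved initial condition for the next cycle; the model, covariances, and observation operator are all assumptions of this sketch, not the paper's U-NET or DA configuration.

```python
# Basic forecast/analysis cycling: the analysis blends forecast and observation.
import numpy as np

rng = np.random.default_rng(3)
n = 5
A = np.diag(np.full(n, 0.9)) + np.diag(np.full(n - 1, 0.3), k=1)   # toy forecast operator

def ddwp(x):
    """Toy stand-in for a data-driven weather prediction model."""
    return A @ x

x_true = rng.standard_normal(n)
x_a = x_true + 0.5 * rng.standard_normal(n)        # imperfect initial analysis
B = 0.25 * np.eye(n)                               # background-error covariance
R = 0.04 * np.eye(n)                               # observation-error covariance
H = np.eye(n)                                      # observe the full state

for cycle in range(10):
    x_true = ddwp(x_true) + 0.01 * rng.standard_normal(n)
    x_f = ddwp(x_a)                                # data-driven forecast
    y = H @ x_true + 0.2 * rng.standard_normal(n)  # noisy observation
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman gain
    x_a = x_f + K @ (y - H @ x_f)                  # analysis = improved initial condition
    print(f"cycle {cycle}: forecast err {np.linalg.norm(x_f - x_true):.3f}, "
          f"analysis err {np.linalg.norm(x_a - x_true):.3f}")
```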
no code implementations • 25 Feb 2020 • Ashesh Chattopadhyay, Adam Subel, Pedram Hassanzadeh
To make weather/climate modeling computationally affordable, small-scale processes are usually represented in terms of the large-scale, explicitly resolved processes using physics-based or semi-empirical parameterization schemes.
1 code implementation • 26 Jul 2019 • Ashesh Chattopadhyay, Ebrahim Nabizadeh, Pedram Hassanzadeh
The trained networks predict the occurrence/region of cold or heat waves, using only Z500, with accuracies (recalls) of $69\%-45\%$ $(77\%-48\%)$ or $62\%-41\%$ $(73\%-47\%)$ $1-5$ days ahead.
4 code implementations • 20 Jun 2019 • Ashesh Chattopadhyay, Pedram Hassanzadeh, Devika Subramanian
The multiscale Lorenz 96 system has three tiers of nonlinearly interacting variables, representing slow/large-scale ($X$), intermediate ($Y$), and fast/small-scale ($Z$) processes.
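For reference, a minimal RK4 integration of a three-tier Lorenz 96 system in its standard multiscale form; the parameter values and the simplified periodic coupling within each sector are assumptions of this sketch and may differ from the paper's exact configuration.

```python
# Three-tier (X, Y, Z) Lorenz 96: slow, intermediate, and fast variables.
import numpy as np

K, J, I = 8, 8, 8                     # number of slow, intermediate, fast variables
F, h, b, c, d, e, g = 20.0, 1.0, 10.0, 10.0, 10.0, 10.0, 10.0

def tendencies(X, Y, Z):
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2)) - X + F
          - (h * c / b) * Y.sum(axis=0))
    dY = (-c * b * np.roll(Y, -1, 0) * (np.roll(Y, -2, 0) - np.roll(Y, 1, 0))
          - c * Y + (h * c / b) * X - (h * e / d) * Z.sum(axis=0))
    dZ = (e * d * np.roll(Z, 1, 0) * (np.roll(Z, -1, 0) - np.roll(Z, 2, 0))
          - g * e * Z + (h * e / d) * Y)
    return dX, dY, dZ

def rk4_step(X, Y, Z, dt):
    k1 = tendencies(X, Y, Z)
    k2 = tendencies(X + 0.5 * dt * k1[0], Y + 0.5 * dt * k1[1], Z + 0.5 * dt * k1[2])
    k3 = tendencies(X + 0.5 * dt * k2[0], Y + 0.5 * dt * k2[1], Z + 0.5 * dt * k2[2])
    k4 = tendencies(X + dt * k3[0], Y + dt * k3[1], Z + dt * k3[2])
    return tuple(s + dt / 6.0 * (t1 + 2 * t2 + 2 * t3 + t4)
                 for s, t1, t2, t3, t4 in zip((X, Y, Z), k1, k2, k3, k4))

rng = np.random.default_rng(4)
X = F * np.ones(K) + rng.standard_normal(K)       # slow/large-scale
Y = 0.1 * rng.standard_normal((J, K))             # intermediate, coupled to each X_k
Z = 0.01 * rng.standard_normal((I, J, K))         # fast/small-scale, coupled to each Y_jk

for step in range(2000):
    X, Y, Z = rk4_step(X, Y, Z, dt=0.001)
print("X after integration:", np.round(X, 2))
```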
1 code implementation • 12 Nov 2018 • Ashesh Chattopadhyay, Pedram Hassanzadeh, Saba Pasha
To address these challenges, here we (1) propose an effective auto-labeling strategy based on using an unsupervised clustering algorithm and evaluating the performance of CNNs in re-identifying these clusters; and (2) use this approach to label thousands of daily large-scale weather patterns over North America in the outputs of a fully-coupled climate model and show the capabilities of CNNs in re-identifying the four clustered regimes.
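A hedged end-to-end sketch of the auto-labeling strategy on synthetic maps: K-means provides the cluster labels, and a small CNN is then trained to re-identify those clusters on held-out samples. The synthetic patterns, network size, and four-cluster choice are stand-ins for the actual Z500 data and climate-model output.

```python
# Auto-labeling by clustering, then CNN re-identification of the clusters.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_maps, ny, nx, n_clusters = 400, 16, 16, 4

# Synthetic daily maps: four distinct large-scale patterns plus noise
yy, xx = np.meshgrid(np.linspace(0, np.pi, ny), np.linspace(0, np.pi, nx), indexing="ij")
patterns = [np.sin(yy), np.cos(xx), np.sin(2 * yy) * np.cos(xx), np.cos(2 * xx)]
true_regime = rng.integers(0, n_clusters, n_maps)
maps = np.stack([patterns[r] + 0.3 * rng.standard_normal((ny, nx)) for r in true_regime])

# Step 1: unsupervised auto-labeling with K-means on the flattened maps
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(
    maps.reshape(n_maps, -1))

# Step 2: train a small CNN to re-identify the clusters
x = torch.tensor(maps, dtype=torch.float32).unsqueeze(1)
y = torch.tensor(labels, dtype=torch.long)
train, test = slice(0, 300), slice(300, n_maps)

cnn = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(), nn.Flatten(),
                    nn.Linear(8 * 8 * 8, n_clusters))
opt = torch.optim.Adam(cnn.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(50):
    opt.zero_grad()
    loss = loss_fn(cnn(x[train]), y[train])
    loss.backward()
    opt.step()

with torch.no_grad():
    acc = (cnn(x[test]).argmax(dim=1) == y[test]).float().mean()
print(f"CNN re-identification accuracy on held-out maps: {acc:.2f}")
```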