Search Results for author: Pedram Hassanzadeh

Found 17 papers, 13 papers with code

Fourier analysis of the physics of transfer learning for data-driven subgrid-scale models of ocean turbulence

no code implementations · 21 Apr 2025 · Moein Darman, Pedram Hassanzadeh, Laure Zanna, Ashesh Chattopadhyay

By re-training only one layer with data from the target system, this underestimation is corrected, enabling the NN to produce predictions that match the target spectra.
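The core idea, correcting a pre-trained network by re-training a single layer on target-system data, can be sketched in a few lines. This is a minimal NumPy illustration with made-up weights and synthetic data, not the paper's spectral analysis or actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for weights pre-trained on the "source" turbulence system.
W1 = rng.normal(size=(8, 4))          # hidden layer: kept frozen
W2 = rng.normal(size=(1, 8)) * 0.1    # output layer: the only layer re-trained

# Synthetic samples from a hypothetical "target" system.
X = rng.normal(size=(200, 4))
y = np.tanh(X @ W1.T) @ rng.normal(size=(8, 1))

def forward(X):
    return np.tanh(X @ W1.T) @ W2.T

mse_before = float(np.mean((forward(X) - y) ** 2))

# Re-train only W2 by gradient descent; W1 is never updated.
lr = 0.05
for _ in range(500):
    H = np.tanh(X @ W1.T)             # features from the frozen layer
    err = H @ W2.T - y
    W2 -= lr * (err.T @ H) / len(X)   # gradient of the mean-squared error w.r.t. W2

mse_after = float(np.mean((forward(X) - y) ** 2))
```

Because the frozen features are fixed, re-training the last layer is a convex least-squares problem, which is part of why such one-layer updates are cheap and stable.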

Transfer Learning

On the importance of learning non-local dynamics for stable data-driven climate modeling: A 1D gravity wave-QBO testbed

1 code implementation · 7 Jul 2024 · Hamid A. Pahlavan, Pedram Hassanzadeh, M. Joan Alexander

We also demonstrate that learning non-local dynamics is crucial for the stability and accuracy of a data-driven spatiotemporal emulator of the zonal wind field.

Extreme Event Prediction with Multi-agent Reinforcement Learning-based Parametrization of Atmospheric and Oceanic Turbulence

no code implementations · 1 Dec 2023 · Rambod Mojgani, Daniel Waelchli, Yifei Guan, Petros Koumoutsakos, Pedram Hassanzadeh

Reinforcement learning is emerging as a potent alternative for developing such closures as it requires only low-order statistics and leads to stable closures.

Multi-agent Reinforcement Learning

Learning Closed-form Equations for Subgrid-scale Closures from High-fidelity Data: Promises and Challenges

2 code implementations · 8 Jun 2023 · Karan Jakhar, Yifei Guan, Rambod Mojgani, Ashesh Chattopadhyay, Pedram Hassanzadeh

These closures depend on nonlinear combinations of gradients of filtered variables, with constants that are independent of the fluid/flow properties and only depend on filter type/size.
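A closure of that general form, for instance a Clark-type nonlinear-gradient model, can be sketched with periodic central differences. The constant and filter size below are illustrative assumptions, not the values discovered in the paper:

```python
import numpy as np

def nonlinear_gradient_closure(u, v, dx, c=1.0 / 12.0):
    """Clark-type closure tau_xy ~ c * Delta^2 * (du/dx_k)(dv/dx_k) on a
    periodic grid, using second-order central differences (illustrative only)."""
    dudx = (np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) / (2 * dx)
    dudy = (np.roll(u, -1, axis=0) - np.roll(u, 1, axis=0)) / (2 * dx)
    dvdx = (np.roll(v, -1, axis=1) - np.roll(v, 1, axis=1)) / (2 * dx)
    dvdy = (np.roll(v, -1, axis=0) - np.roll(v, 1, axis=0)) / (2 * dx)
    delta = 2 * dx                     # filter size, assumed twice the grid spacing
    return c * delta**2 * (dudx * dvdx + dudy * dvdy)
```

Note that the closure depends only on gradients of the filtered fields and on the filter size `delta`, consistent with the snippet's point that the constants are independent of the fluid/flow properties.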

Equation Discovery

Challenges of learning multi-scale dynamics with AI weather models: Implications for stability and one solution

1 code implementation · 14 Apr 2023 · Ashesh Chattopadhyay, Y. Qiang Sun, Pedram Hassanzadeh

Long-term stability and physical consistency are critical properties for AI-based weather models if they are to be used for subseasonal-to-seasonal forecasts or beyond, e.g., climate change projections.

Decision Making · Deep Learning +2

Deep learning-enhanced ensemble-based data assimilation for high-dimensional nonlinear dynamical systems

no code implementations · 9 Jun 2022 · Ashesh Chattopadhyay, Ebrahim Nabizadeh, Eviatar Bach, Pedram Hassanzadeh

With small ensembles, the estimated background error covariance matrix in the EnKF algorithm suffers from sampling error, leading to an erroneous estimate of the analysis state (initial condition for the next forecast cycle).
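The sampling error described here is easy to see in a toy setting: the Frobenius-norm error of an ensemble-estimated covariance shrinks as the ensemble grows. A minimal NumPy sketch, with an identity true covariance and no localization or inflation (purely illustrative, not the paper's EnKF setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10                       # state dimension
true_cov = np.eye(n)         # true background error covariance (identity for illustration)

def sample_cov_error(n_ens):
    """Frobenius-norm error of the ensemble-estimated covariance matrix."""
    ens = rng.multivariate_normal(np.zeros(n), true_cov, size=n_ens)
    B_hat = np.cov(ens, rowvar=False)
    return np.linalg.norm(B_hat - true_cov)

# Average the error over repeated trials for small and large ensembles.
err_small = np.mean([sample_cov_error(10) for _ in range(50)])
err_large = np.mean([sample_cov_error(1000) for _ in range(50)])
```

The error decays roughly like 1/sqrt(ensemble size), which is why small operational ensembles need localization/inflation or, as proposed here, learned corrections.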

Long-term stability and generalization of observationally-constrained stochastic data-driven models for geophysical turbulence

1 code implementation · 9 May 2022 · Ashesh Chattopadhyay, Jaideep Pathak, Ebrahim Nabizadeh, Wahid Bhimji, Pedram Hassanzadeh

In this paper, we propose a convolutional variational autoencoder-based stochastic data-driven model that is pre-trained on an imperfect climate model simulation from a 2-layer quasi-geostrophic flow and re-trained, using transfer learning, on a small number of noisy observations from a perfect simulation.

Transfer Learning · Weather Forecasting

Towards physically consistent data-driven weather forecasting: Integrating data assimilation with equivariance-preserving deep spatial transformers

1 code implementation · 16 Mar 2021 · Ashesh Chattopadhyay, Mustafa Mustafa, Pedram Hassanzadeh, Eviatar Bach, Karthik Kashinath

These components are 1) a deep spatial transformer added to the latent space of the U-NETs to preserve a property called equivariance, which is related to correctly capturing rotations and scalings of features in spatio-temporal data, 2) a data-assimilation (DA) algorithm to ingest noisy observations and improve the initial conditions for next forecasts, and 3) a multi-time-step algorithm, which combines forecasts from DDWP models with different time steps through DA, improving the accuracy of forecasts at short intervals.

Weather Forecasting

Data-driven super-parameterization using deep learning: Experimentation with multi-scale Lorenz 96 systems and transfer-learning

no code implementations · 25 Feb 2020 · Ashesh Chattopadhyay, Adam Subel, Pedram Hassanzadeh

To make weather/climate modeling computationally affordable, small-scale processes are usually represented in terms of the large-scale, explicitly-resolved processes using physics-based or semi-empirical parameterization schemes.

Transfer Learning

Analog forecasting of extreme-causing weather patterns using deep learning

1 code implementation · 26 Jul 2019 · Ashesh Chattopadhyay, Ebrahim Nabizadeh, Pedram Hassanzadeh

The trained networks predict the occurrence/region of cold or heat waves, only using Z500, with accuracies (recalls) of $69\%-45\%$ $(77\%-48\%)$ or $62\%-41\%$ $(73\%-47\%)$ $1-5$ days ahead.

Deep Learning

Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM

4 code implementations · 20 Jun 2019 · Ashesh Chattopadhyay, Pedram Hassanzadeh, Devika Subramanian

This Lorenz 96 system has three tiers of nonlinearly interacting variables representing slow/large-scale ($X$), intermediate ($Y$), and fast/small-scale ($Z$) processes.
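For reference, the classic single-tier Lorenz 96 system (the paper's three-tier version couples the $X$, $Y$ and $Z$ variables analogously) can be integrated with a standard RK4 scheme. A self-contained sketch, not the paper's code:

```python
import numpy as np

def l96_tendency(x, F=8.0):
    """Single-tier Lorenz 96: dX_k/dt = (X_{k+1} - X_{k-2}) X_{k-1} - X_k + F,
    with periodic indexing handled by np.roll."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt=0.01):
    """One fourth-order Runge-Kutta step."""
    k1 = l96_tendency(x)
    k2 = l96_tendency(x + 0.5 * dt * k1)
    k3 = l96_tendency(x + 0.5 * dt * k2)
    k4 = l96_tendency(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Start near the unstable fixed point x = F with a tiny perturbation.
x = 8.0 * np.ones(40)
x[0] += 0.01
for _ in range(1000):         # integrate to t = 10, well into the chaotic regime
    x = rk4_step(x)
```

Trajectories like this one are the training/testing data for the data-driven methods (reservoir computing, ANN, RNN-LSTM) compared in the paper.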

A test case for application of convolutional neural networks to spatio-temporal climate data: Re-identifying clustered weather patterns

1 code implementation · 12 Nov 2018 · Ashesh Chattopadhyay, Pedram Hassanzadeh, Saba Pasha

To address these challenges, here we (1) propose an effective auto-labeling strategy based on using an unsupervised clustering algorithm and evaluating the performance of CNNs in re-identifying these clusters, and (2) use this approach to label thousands of daily large-scale weather patterns over North America in the outputs of a fully-coupled climate model, showing the capabilities of CNNs in re-identifying the 4 clustered regimes.
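The auto-labeling idea (cluster first, then treat cluster indices as supervised labels) can be sketched on synthetic data. The tiny k-means below uses fixed initial centers and stand-in 2-D "patterns" rather than real Z500 maps:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-ins for daily large-scale patterns: two well-separated groups.
patterns = np.vstack([rng.normal(0.0, 0.3, size=(100, 2)),
                      rng.normal(3.0, 0.3, size=(100, 2))])

def kmeans(X, k, n_iter=20):
    """Tiny k-means with deterministic initialization (first/last points)."""
    centers = X[[0, -1]].astype(float) if k == 2 else X[:k].astype(float)
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Cluster indices become the training labels for a supervised classifier
# (a CNN in the paper; any classifier in this sketch).
labels = kmeans(patterns, k=2)
```

The appeal of the strategy is that no hand-labeling is needed: the clustering provides labels, and the classifier's ability to re-identify the clusters is itself the evaluation.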

Clustering
