Search Results for author: Ashesh Chattopadhyay

Found 13 papers, 10 papers with code

OceanNet: A principled neural operator-based digital twin for regional oceans

no code implementations · 1 Oct 2023 · Ashesh Chattopadhyay, Michael Gray, Tianning Wu, Anna B. Lowe, Ruoying He

While data-driven approaches demonstrate great potential in atmospheric modeling and weather forecasting, ocean modeling poses distinct challenges due to complex bathymetry, land, vertical structure, and flow non-linearity.

Weather Forecasting

Learning Closed-form Equations for Subgrid-scale Closures from High-fidelity Data: Promises and Challenges

1 code implementation · 8 Jun 2023 · Karan Jakhar, Yifei Guan, Rambod Mojgani, Ashesh Chattopadhyay, Pedram Hassanzadeh, Laura Zanna

There is growing interest in discovering interpretable, closed-form equations for subgrid-scale (SGS) closures/parameterizations of complex processes in the Earth system.

Long-term instabilities of deep learning-based digital twins of the climate system: The cause and a solution

1 code implementation · 14 Apr 2023 · Ashesh Chattopadhyay, Pedram Hassanzadeh

Owing to computational cost, physics-based digital twins, though long-term stable, are intractable for real-time decision-making.

Decision Making · Learning Theory

Deep learning-enhanced ensemble-based data assimilation for high-dimensional nonlinear dynamical systems

no code implementations · 9 Jun 2022 · Ashesh Chattopadhyay, Ebrahim Nabizadeh, Eviatar Bach, Pedram Hassanzadeh

With small ensembles, the estimated background error covariance matrix in the EnKF algorithm suffers from sampling error, leading to an erroneous estimate of the analysis state (initial condition for the next forecast cycle).
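To make the sampling-error mechanism concrete, here is a minimal stochastic EnKF analysis step in NumPy. This is an illustrative sketch, not the paper's deep-learning-enhanced method: the ensemble size, observation operator, and the multiplicative-inflation fix shown here are all assumptions chosen for demonstration. The key point is that the background covariance `P_b` is estimated from only `n_ens` members, so with small ensembles it is rank-deficient and noisy.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(X_b, y_obs, H, R, inflation=1.0):
    """One stochastic EnKF analysis step (illustrative sketch).

    X_b       : (n_state, n_ens) background ensemble
    y_obs     : (n_obs,) observation vector
    H         : (n_obs, n_state) linear observation operator
    R         : (n_obs, n_obs) observation-error covariance
    inflation : multiplicative covariance inflation (> 1), a standard
                remedy for small-ensemble sampling error
    """
    n_state, n_ens = X_b.shape
    x_mean = X_b.mean(axis=1, keepdims=True)
    # Inflate ensemble deviations about the mean
    A = inflation * (X_b - x_mean)
    X_b = x_mean + A
    # Sample background covariance: rank <= n_ens - 1, hence the
    # sampling error that corrupts the analysis with small ensembles
    P_b = A @ A.T / (n_ens - 1)
    # Kalman gain
    K = P_b @ H.T @ np.linalg.inv(H @ P_b @ H.T + R)
    # Perturbed observations, one realization per ensemble member
    Y = y_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(y_obs)), R, size=n_ens).T
    return X_b + K @ (Y - H @ X_b)

# Toy example: 3-variable state, 2 observed components, tiny ensemble
n, m, N = 3, 2, 5
X_b = rng.normal(size=(n, N))
H = np.eye(m, n)
R = 0.1 * np.eye(m)
X_a = enkf_analysis(X_b, np.array([1.0, -1.0]), H, R, inflation=1.05)
print(X_a.shape)  # (3, 5): analysis ensemble, same shape as background
```

The analysis ensemble then seeds the next forecast cycle; with `N` much smaller than the state dimension, spurious correlations in `P_b` are what localization, inflation, or learned corrections aim to suppress.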

Long-term stability and generalization of observationally-constrained stochastic data-driven models for geophysical turbulence

1 code implementation · 9 May 2022 · Ashesh Chattopadhyay, Jaideep Pathak, Ebrahim Nabizadeh, Wahid Bhimji, Pedram Hassanzadeh

In this paper, we propose a convolutional variational autoencoder-based stochastic data-driven model that is pre-trained on an imperfect climate model simulation from a 2-layer quasi-geostrophic flow and re-trained, using transfer learning, on a small number of noisy observations from a perfect simulation.

Transfer Learning · Weather Forecasting

Towards physically consistent data-driven weather forecasting: Integrating data assimilation with equivariance-preserving deep spatial transformers

1 code implementation · 16 Mar 2021 · Ashesh Chattopadhyay, Mustafa Mustafa, Pedram Hassanzadeh, Eviatar Bach, Karthik Kashinath

These components are 1) a deep spatial transformer added to the latent space of the U-NETs to preserve a property called equivariance, which is related to correctly capturing rotations and scalings of features in spatio-temporal data, 2) a data-assimilation (DA) algorithm to ingest noisy observations and improve the initial conditions for the next forecasts, and 3) a multi-time-step algorithm, which combines forecasts from DDWP models with different time steps through DA, improving the accuracy of forecasts at short intervals.

Weather Forecasting

Data-driven super-parameterization using deep learning: Experimentation with multi-scale Lorenz 96 systems and transfer-learning

no code implementations · 25 Feb 2020 · Ashesh Chattopadhyay, Adam Subel, Pedram Hassanzadeh

To make weather/climate modeling computationally affordable, small-scale processes are usually represented in terms of the large-scale, explicitly-resolved processes using physics-based or semi-empirical parameterization schemes.
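A semi-empirical parameterization of the kind described above can be illustrated in a few lines: regress the unresolved (subgrid) tendency on the resolved state. This is a generic Wilks-style polynomial closure, not the paper's super-parameterization; the cubic relation and noise level below are synthetic assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: resolved large-scale state X paired with
# the subgrid tendency U it must represent. The cubic ground truth and
# noise are fabricated for this sketch only.
X = rng.uniform(-5, 5, size=2000)
U_true = 0.3 * X - 0.02 * X**3
U = U_true + 0.1 * rng.normal(size=X.shape)

# Semi-empirical parameterization: a cubic polynomial fit of U on X,
# standing in for the closure a coarse model would call each time step.
coeffs = np.polyfit(X, U, deg=3)
U_param = np.polyval(coeffs, X)

rmse = np.sqrt(np.mean((U_param - U_true) ** 2))
print(f"parameterization RMSE vs truth: {rmse:.3f}")
```

The deep-learning super-parameterization in the paper replaces this fixed functional form with a neural network trained on high-resolution simulations of the small-scale dynamics.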

Transfer Learning

Analog forecasting of extreme-causing weather patterns using deep learning

1 code implementation · 26 Jul 2019 · Ashesh Chattopadhyay, Ebrahim Nabizadeh, Pedram Hassanzadeh

The trained networks predict the occurrence/region of cold or heat waves, only using Z500, with accuracies (recalls) of $69\%-45\%$ $(77\%-48\%)$ or $62\%-41\%$ $(73\%-47\%)$ $1-5$ days ahead.

Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM

4 code implementations · 20 Jun 2019 · Ashesh Chattopadhyay, Pedram Hassanzadeh, Devika Subramanian

This Lorenz 96 system has three tiers of nonlinearly interacting variables representing slow/large-scale ($X$), intermediate ($Y$), and fast/small-scale ($Z$) processes.
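The three-tier coupling can be sketched directly. The tendencies below follow a common formulation of the multi-scale Lorenz 96 system (e.g. Thornes et al. 2017); the forcing and coupling constants are illustrative defaults, not necessarily the paper's values, and the cyclic boundary handling is simplified to be periodic within each column of `Y` and `Z`.

```python
import numpy as np

def l96_3tier_tendencies(X, Y, Z, F=20.0, h=1.0, b=10.0, c=10.0,
                         e=10.0, d=10.0):
    """Tendencies of a three-tier Lorenz 96 system (illustrative sketch).

    X : (K,)       slow/large-scale variables
    Y : (J, K)     intermediate variables, J per X_k
    Z : (I, J, K)  fast/small-scale variables, I per Y_{j,k}
    Parameters F, h, b, c, e, d are assumed values, not the paper's.
    """
    sumY = Y.sum(axis=0)   # couples each X_k to its J children
    sumZ = Z.sum(axis=0)   # couples each Y_{j,k} to its I children
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * sumY)
    dY = (c * b * np.roll(Y, -1, axis=0)
          * (np.roll(Y, 1, axis=0) - np.roll(Y, -2, axis=0))
          - c * Y + (h * c / b) * X - (h * e / d) * sumZ)
    dZ = (e * d * np.roll(Z, 1, axis=0)
          * (np.roll(Z, -1, axis=0) - np.roll(Z, 2, axis=0))
          - e * Z + (h * e / d) * Y)
    return dX, dY, dZ

# Small illustrative configuration
K, J, I = 8, 4, 4
rng = np.random.default_rng(2)
dX, dY, dZ = l96_3tier_tendencies(rng.normal(size=K),
                                  rng.normal(size=(J, K)),
                                  rng.normal(size=(I, J, K)))
print(dX.shape, dY.shape, dZ.shape)  # (8,) (4, 8) (4, 4, 8)
```

Integrating these tendencies (e.g. with RK4) produces the chaotic multi-scale trajectories that the paper's reservoir computing, ANN, and RNN-LSTM models learn to forecast.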

A test case for application of convolutional neural networks to spatio-temporal climate data: Re-identifying clustered weather patterns

1 code implementation · 12 Nov 2018 · Ashesh Chattopadhyay, Pedram Hassanzadeh, Saba Pasha

To address these challenges, here we (1) Propose an effective auto-labeling strategy based on using an unsupervised clustering algorithm and evaluating the performance of CNNs in re-identifying these clusters; (2) Use this approach to label thousands of daily large-scale weather patterns over North America in the outputs of a fully-coupled climate model and show the capabilities of CNNs in re-identifying the 4 clustered regimes.
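The auto-labeling strategy can be shown in miniature: cluster unlabeled samples, treat the cluster assignments as labels, then check whether a supervised model re-identifies them on held-out data. In this sketch the 2-D Gaussian blobs stand in for daily Z500 maps, a plain Lloyd's k-means provides the unsupervised labels, and a nearest-centroid rule stands in for the CNN; all of these stand-ins are assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for daily weather maps: 3 well-separated "regimes"
centers = np.array([[0.0, 0.0], [5.0, 5.0], [-5.0, 5.0]])
data = np.vstack([c + rng.normal(size=(200, 2)) for c in centers])

def kmeans(X, k, iters=50):
    """Plain Lloyd's k-means: the unsupervised auto-labeling step."""
    cent = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - cent[None]) ** 2).sum(-1), axis=1)
        cent = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, cent

labels, cent = kmeans(data, k=3)

# Re-identification step: a nearest-centroid classifier stands in for
# the CNN; held-out samples from the same regimes should map back to
# the auto-generated cluster labels.
test = np.vstack([c + rng.normal(size=(50, 2)) for c in centers])
pred = np.argmin(((test[:, None] - cent[None]) ** 2).sum(-1), axis=1)
# True regime of each held-out sample, remapped to learned cluster ids
truth = np.argmin(((centers[:, None] - cent[None]) ** 2).sum(-1),
                  axis=1).repeat(50)
acc = (pred == truth).mean()
print(f"re-identification accuracy: {acc:.2f}")
```

High re-identification accuracy on held-out samples is the evidence that the unsupervised labels are learnable, which is what justifies using them to train the CNN at scale.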

Clustering · Test
