no code implementations • 2 Oct 2024 • Ignacio Lopez-Gomez, Zhong Yi Wan, Leonardo Zepeda-Núñez, Tapio Schneider, John Anderson, Fei Sha
Regional high-resolution climate projections are crucial for many applications, such as agriculture, hydrology, and natural hazard risk assessment.
no code implementations • 27 Sep 2024 • Roberto Molinaro, Samuel Lanthaler, Bogdan Raonić, Tobias Rohner, Victor Armegioiu, Zhong Yi Wan, Fei Sha, Siddhartha Mishra, Leonardo Zepeda-Núñez
We present a generative AI algorithm for fast, accurate, and robust statistical computation of three-dimensional turbulent fluid flows.
no code implementations • 13 Sep 2024 • Shantanu Shahane, Sheide Chammas, Deniz A. Bezgin, Aaron B. Buhendwa, Steffen J. Schmidt, Nikolaus A. Adams, Spencer H. Bryngelson, Yi-fan Chen, Qing Wang, Fei Sha, Leonardo Zepeda-Núñez
Conventional WENO3 methods are known to be highly dissipative at lower resolutions, introducing significant errors in the pre-asymptotic regime.
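For context on the baseline scheme, the snippet below is a minimal sketch of a standard third-order WENO (WENO3) reconstruction in Python; it is not the solver used in the paper, and the `weno3_reconstruct` helper and its parameters are purely illustrative. The nonlinear weights that suppress oscillations near sharp features are also the mechanism behind the extra numerical dissipation on coarse grids.

```python
import numpy as np

def weno3_reconstruct(u, eps=1e-6):
    """Third-order WENO reconstruction of u at interfaces i+1/2 (left-biased).

    u: 1D array of cell averages on a periodic grid.
    Returns the reconstructed interface values, same length as u.
    """
    um1 = np.roll(u, 1)   # u_{i-1}
    up1 = np.roll(u, -1)  # u_{i+1}

    # Candidate second-order stencils for u_{i+1/2}
    p0 = -0.5 * um1 + 1.5 * u   # stencil {i-1, i}
    p1 = 0.5 * u + 0.5 * up1    # stencil {i, i+1}

    # Jiang-Shu smoothness indicators
    beta0 = (u - um1) ** 2
    beta1 = (up1 - u) ** 2

    # Nonlinear weights built from the linear (optimal) weights 1/3 and 2/3
    a0 = (1.0 / 3.0) / (eps + beta0) ** 2
    a1 = (2.0 / 3.0) / (eps + beta1) ** 2
    w0, w1 = a0 / (a0 + a1), a1 / (a0 + a1)

    return w0 * p0 + w1 * p1

# Example: reconstruct a smooth periodic profile on a coarse grid
x = np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False)
u_half = weno3_reconstruct(np.sin(x))
```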
no code implementations • 5 Aug 2024 • Borong Zhang, Martín Guerra, Qin Li, Leonardo Zepeda-Núñez
We present Wideband back-projection diffusion, an end-to-end probabilistic framework for approximating the posterior distribution induced by the inverse scattering map from wideband scattering data.
no code implementations • 2 Aug 2024 • Benedikt Barthel Sorensen, Leonardo Zepeda-Núñez, Ignacio Lopez-Gomez, Zhong Yi Wan, Rob Carver, Fei Sha, Themistoklis Sapsis
In this work we present a general strategy for training neural network models to non-intrusively correct under-resolved long-time simulations of chaotic systems.
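To make the non-intrusive pattern concrete, here is a hedged sketch in which the paper's neural network corrector is replaced by a plain linear operator fit by least squares; `coarse_step`, the synthetic snapshot data, and all shapes are stand-ins. The point is only the workflow: step the black-box coarse solver, then apply a learned correction to its state without modifying the solver itself.

```python
import numpy as np

def coarse_step(u):
    """Placeholder for an under-resolved, black-box solver step (assumption)."""
    return u + 0.1 * np.roll(u, 1) - 0.1 * u  # stand-in dynamics

# Suppose we have paired snapshots: coarse predictions and reference targets
# obtained from a fully resolved simulation (synthetic stand-ins here).
rng = np.random.default_rng(0)
U_coarse = rng.standard_normal((1000, 32))
U_ref = U_coarse @ rng.standard_normal((32, 32)) * 0.05 + U_coarse  # stand-in

# Non-intrusive correction: fit an operator C so that coarse_state @ C ~ reference,
# then apply it after every coarse solver step.
C, *_ = np.linalg.lstsq(U_coarse, U_ref, rcond=None)

u = rng.standard_normal(32)
for _ in range(10):
    u = coarse_step(u) @ C   # black-box solver step, then learned correction
```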
1 code implementation • 6 Feb 2024 • Yair Schiff, Zhong Yi Wan, Jeffrey B. Parker, Stephan Hoyer, Volodymyr Kuleshov, Fei Sha, Leonardo Zepeda-Núñez
Learning dynamics from dissipative chaotic systems is notoriously difficult due to their inherent instability, as formalized by their positive Lyapunov exponents, which exponentially amplify errors in the learned dynamics.
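A toy illustration of why positive Lyapunov exponents make this hard: two Lorenz-63 trajectories started 1e-8 apart separate roughly exponentially in time, so any small error in learned dynamics is quickly amplified. This is a generic demonstration, not the systems or models studied in the paper.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system (illustrative only)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

# Two trajectories that differ by a tiny perturbation in the initial condition
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])

for step in range(2001):
    if step % 500 == 0:
        print(f"t = {step * 0.01:5.1f}, separation = {np.linalg.norm(a - b):.3e}")
    a, b = lorenz_step(a), lorenz_step(b)
```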
no code implementations • 13 Jun 2023 • Marc Finzi, Anudhyan Boral, Andrew Gordon Wilson, Fei Sha, Leonardo Zepeda-Núñez
In this work, we develop a probabilistic approximation scheme for the conditional score function which provably converges to the true distribution as the noise level decreases.
1 code implementation • NeurIPS 2023 • Zhong Yi Wan, Ricardo Baptista, Yi-fan Chen, John Anderson, Anudhyan Boral, Fei Sha, Leonardo Zepeda-Núñez
Moreover, our procedure correctly matches the statistics of physical quantities, even when the low-frequency content of the inputs and outputs does not match, a crucial but difficult-to-satisfy assumption needed by current state-of-the-art alternatives.
no code implementations • 25 Jan 2023 • Zhong Yi Wan, Leonardo Zepeda-Núñez, Anudhyan Boral, Fei Sha
We present a data-driven, space-time continuous framework to learn surrogate models for complex physical systems described by advection-dominated partial differential equations.
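As a rough caricature of "space-time continuous", the sketch below evaluates a tiny coordinate network u_theta(x, t) at arbitrary off-grid points; the random weights are stand-ins for parameters that would in practice be trained on simulation data, and the architecture is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical weights of a small coordinate network u_theta(x, t);
# in practice these would be learned, not drawn at random.
W1, b1 = rng.standard_normal((2, 64)), np.zeros(64)
W2, b2 = rng.standard_normal((64, 1)), np.zeros(1)

def surrogate(x, t):
    """Query the surrogate at arbitrary continuous coordinates (x, t)."""
    h = np.tanh(np.stack([x, t], axis=-1) @ W1 + b1)
    return (h @ W2 + b2)[..., 0]

# Because the model takes coordinates directly, it can be evaluated off-grid.
print(surrogate(np.array([0.1, 0.37, 0.9]), np.array([0.0, 0.5, 1.23])))
```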
2 code implementations • 1 Jul 2022 • Gideon Dresdner, Dmitrii Kochkov, Peter Norgaard, Leonardo Zepeda-Núñez, Jamie A. Smith, Michael P. Brenner, Stephan Hoyer
We build upon Fourier-based spectral methods, which are known to be more efficient than other numerical schemes for simulating PDEs with smooth and periodic solutions.
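A minimal example of the efficiency argument, unrelated to the paper's code: for the periodic 1D heat equation, a Fourier pseudo-spectral method turns spatial derivatives into per-mode multiplications, so the smooth solution can be advanced exactly mode by mode.

```python
import numpy as np

# Minimal pseudo-spectral solve of the periodic 1D heat equation u_t = nu * u_xx.
# Each Fourier mode decays independently, so the update is exact and cheap:
# u_hat(k, t) = u_hat(k, 0) * exp(-nu * k^2 * t).
n, nu, t_final = 128, 0.01, 2.0
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
k = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers on [0, 2*pi)

u0 = np.sin(x) + 0.5 * np.sin(3.0 * x)    # smooth periodic initial condition
u_hat = np.fft.fft(u0) * np.exp(-nu * k**2 * t_final)
u = np.real(np.fft.ifft(u_hat))           # solution at t = t_final
```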
no code implementations • 2 Jun 2021 • Matthew Li, Laurent Demanet, Leonardo Zepeda-Núñez
We propose an end-to-end deep learning framework that comprehensively solves the inverse wave scattering problem across all length scales.
no code implementations • 24 Nov 2020 • Matthew Li, Laurent Demanet, Leonardo Zepeda-Núñez
We introduce an end-to-end deep learning architecture called the wide-band butterfly network (WideBNet) for approximating the inverse scattering map from wide-band scattering data.
1 code implementation • 11 Oct 2020 • Yifan Peng, Lin Lin, Lexing Ying, Leonardo Zepeda-Núñez
We showcase this framework by introducing a neural network architecture that combines LRC-layers with short-range convolutional layers to accurately learn the energy and force associated with an $N$-body potential.
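The 1D sketch below is a loose caricature of pairing a long-range convolution (evaluated in Fourier space at O(n log n) cost) with a short-range stencil; the actual LRC-layer operates on point clouds with learned kernels, and every kernel and stencil here is a made-up placeholder.

```python
import numpy as np

def long_range_conv(u, kernel_hat):
    """Global (long-range) circular convolution applied in Fourier space."""
    return np.real(np.fft.ifft(np.fft.fft(u) * kernel_hat))

def short_range_conv(u, stencil):
    """Local convolution with a small periodic stencil."""
    return sum(w * np.roll(u, s) for s, w in stencil.items())

n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = np.cos(x) + 0.1 * np.cos(7.0 * x)

# Hypothetical "learnable" pieces: a smooth decaying spectrum for the long-range
# kernel and a 3-point stencil for the short-range part.
k = np.fft.fftfreq(n, d=1.0 / n)
kernel_hat = 1.0 / (1.0 + k**2)
features = (long_range_conv(u, kernel_hat)
            + short_range_conv(u, {-1: 0.25, 0: 0.5, 1: 0.25}))
```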
no code implementations • 24 Feb 2020 • Jiefu Zhang, Leonardo Zepeda-Núñez, Yuan Yao, Lin Lin
When such structural information is not available and only a dense neural network can be used, finding the sparse network embedded within it from a given number of samples of the function is akin to finding a needle in a haystack.
no code implementations • 27 Nov 2019 • Leonardo Zepeda-Núñez, Yixiao Chen, Jiefu Zhang, Weile Jia, Linfeng Zhang, Lin Lin
By directly targeting the self-consistent electron density, we demonstrate that the adapted network architecture, called the Deep Density, can effectively represent the electron density as a linear combination of contributions from many local clusters.
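A heavily simplified sketch of the decomposition only: the learned local model is replaced by a hand-written Gaussian stand-in whose amplitude depends on a crude neighbor count, so nothing here reflects the actual Deep Density network, just the sum-over-local-clusters structure.

```python
import numpy as np

def local_contribution(r, center, neighbors):
    """Hypothetical stand-in for the learned local-cluster model: a Gaussian
    centered on the atom, with an amplitude set by its local environment."""
    amplitude = 1.0 + 0.1 * len(neighbors)        # crude environment descriptor
    return amplitude * np.exp(-np.sum((r - center) ** 2, axis=-1))

def density(r, atoms, cutoff=3.0):
    """Total density as a sum of local contributions, one per atom."""
    rho = np.zeros(r.shape[:-1])
    for i, center in enumerate(atoms):
        neighbors = [a for j, a in enumerate(atoms)
                     if j != i and np.linalg.norm(a - center) < cutoff]
        rho += local_contribution(r, center, neighbors)
    return rho

atoms = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.5, 0.0]])
grid = np.random.default_rng(1).uniform(-2.0, 3.0, size=(5, 3))  # sample points
print(density(grid, atoms))
```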