no code implementations • 3 Dec 2023 • Huanyi Qin, Denis Akhiyarov, Sophie Loehle, Kenneth Chiu, Mauricio Araya-Polo
The proposed models are trained with a proprietary antioxidant dataset.
no code implementations • 6 Nov 2023 • Shehtab Zaman, Denis Akhiyarov, Mauricio Araya-Polo, Kenneth Chiu
Machine learning, and deep learning in particular, have had an increasing impact on molecule and materials design.
no code implementations • 31 Oct 2023 • Maayan Gelboim, Amir Adler, Yen Sun, Mauricio Araya-Polo
We consider the problem of 3D seismic inversion from pre-stack data using a very small number of seismic sources.
no code implementations • 24 Sep 2023 • Adrian Celaya, Mauricio Araya-Polo
We introduce a fully 3D, deep learning-based approach for the joint inversion of time-lapse surface gravity and seismic data for reconstructing subsurface density and velocity models.
no code implementations • 16 Jun 2023 • Xin Ju, François P. Hamon, Gege Wen, Rayan Kanfar, Mauricio Araya-Polo, Hamdi A. Tchelepi
Accurately capturing the impact of faults on CO$_2$ plume migration remains a challenge for many existing deep learning surrogate models based on Convolutional Neural Networks (CNNs) or Neural Operators.
1 code implementation • 15 Nov 2022 • Shehtab Zaman, Ethan Ferguson, Cecile Pereira, Denis Akhiyarov, Mauricio Araya-Polo, Kenneth Chiu
We propose $\textit{ParticleGrid}$, a SIMD-optimized library for 3D structures that is designed for deep learning applications and integrates seamlessly with deep learning frameworks.
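The core idea behind such grid representations is to smear 3D point coordinates (e.g. atom positions) onto a regular voxel grid of Gaussian densities that a 3D CNN can consume directly. The sketch below illustrates the concept only; it is not ParticleGrid's actual API, and the function name and parameters are hypothetical:

```python
import numpy as np

def coords_to_density_grid(coords, grid_size=16, box=1.0, sigma=0.05):
    """Smear 3D points in a unit box onto a grid_size^3 voxel grid,
    placing an isotropic Gaussian of width sigma at each point."""
    # Cell-center coordinates along one axis.
    axis = (np.arange(grid_size) + 0.5) * box / grid_size
    gx, gy, gz = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.zeros((grid_size,) * 3)
    for x, y, z in coords:
        grid += np.exp(
            -((gx - x) ** 2 + (gy - y) ** 2 + (gz - z) ** 2)
            / (2.0 * sigma ** 2)
        )
    return grid

# Two "atoms" in a unit box produce a 16x16x16 density volume.
atoms = np.array([[0.25, 0.5, 0.5], [0.75, 0.5, 0.5]])
grid = coords_to_density_grid(atoms)
```

A SIMD-optimized library vectorizes the inner Gaussian evaluation across voxels instead of looping in Python, which is what makes on-the-fly grid generation fast enough to feed a training loop.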
no code implementations • 6 Sep 2022 • Adrian Celaya, Bertrand Denel, Yen Sun, Mauricio Araya-Polo, Antony Price
We introduce three algorithms that invert simulated gravity data to 3D subsurface rock/flow properties.
no code implementations • 29 Jul 2022 • Maayan Gelboim, Amir Adler, Yen Sun, Mauricio Araya-Polo
This paper presents a deep learning solution for the reconstruction of realistic 3D models in the presence of field noise recorded in seismic surveys.
1 code implementation • 6 Apr 2020 • Christopher Thiele, Mauricio Araya-Polo, Detlef Hohl
Training in supervised deep learning is computationally demanding, and the convergence behavior is usually not fully understood.
no code implementations • NeurIPS 2015 • Charlie Frogner, Chiyuan Zhang, Hossein Mobahi, Mauricio Araya-Polo, Tomaso Poggio
In this paper we develop a loss function for multi-label learning, based on the Wasserstein distance.
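A Wasserstein distance between predicted and target label distributions can be approximated with entropy-regularized (Sinkhorn) iterations. The sketch below is a generic illustration under a simple ground metric, not the paper's exact loss formulation:

```python
import numpy as np

def sinkhorn_wasserstein(p, q, cost, reg=0.1, n_iters=200):
    """Entropy-regularized Wasserstein distance between discrete
    distributions p and q, given a ground-cost matrix `cost`."""
    K = np.exp(-cost / reg)          # Gibbs kernel
    u = np.ones_like(p)
    for _ in range(n_iters):         # Sinkhorn scaling iterations
        v = q / (K.T @ u)
        u = p / (K @ v)
    transport = np.diag(u) @ K @ np.diag(v)
    return np.sum(transport * cost)  # transport cost of the plan

# Example: 4 classes whose ground metric is |i - j| between
# class indices, so mass moved to a distant class costs more.
classes = np.arange(4)
cost = np.abs(classes[:, None] - classes[None, :]).astype(float)
p = np.array([0.7, 0.1, 0.1, 0.1])
q = np.array([0.1, 0.1, 0.1, 0.7])
d = sinkhorn_wasserstein(p, q, cost)
```

Unlike a per-label cross-entropy, this loss charges a prediction by how far its mass must travel under the ground metric, so semantically close labels are penalized less than distant ones.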