no code implementations • 17 Feb 2024 • Jeremiah Hauth, Cosmin Safta, Xun Huan, Ravi G. Patel, Reese E. Jones
In this work, we compare parametric uncertainty quantification of neural networks modeling complex spatial-temporal processes using Hamiltonian Monte Carlo, Stein variational gradient descent, and its projected variant.
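Since the abstract names the samplers but not their mechanics, a minimal Stein variational gradient descent sketch on a one-dimensional standard normal target may help; the paper applies such samplers to neural-network parameters, and every name and setting below is an illustrative assumption, not the authors' code.

```python
import numpy as np

# Toy SVGD on a 1-D N(0, 1) target (a stand-in for the neural-network
# posteriors in the paper; all settings here are assumptions).
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=50)   # particles, poorly initialized
eps, h = 0.1, 1.0                             # step size, RBF bandwidth

def grad_log_p(x):
    return -x                                  # score of N(0, 1)

for _ in range(500):
    d = x[:, None] - x[None, :]                # d[j, i] = x_j - x_i
    K = np.exp(-d**2 / (2 * h))                # RBF kernel k(x_j, x_i)
    gK = -d / h * K                            # d k(x_j, x_i) / d x_j
    # SVGD update: kernel-weighted score (attraction) plus kernel
    # gradient (repulsion), averaged over particles j.
    phi = (K * grad_log_p(x)[:, None] + gK).mean(axis=0)
    x = x + eps * phi

mean, std = float(x.mean()), float(x.std())
```

The kernel-gradient term keeps the particles spread out, which is what distinguishes SVGD from plain gradient ascent on the log density.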
no code implementations • 22 Apr 2022 • Ravi G. Patel, Indu Manickam, Myoungkyu Lee, Mamikon Gulian
We propose error-in-variables (EiV) models for two operator regression methods, MOR-Physics and DeepONet, and demonstrate that these new models reduce bias in the presence of noisy independent variables for a variety of operator learning problems.
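As background for why error-in-variables modeling matters, below is a generic numerical illustration of attenuation bias in ordinary least squares under input noise, with a variance-based correction; this is a textbook scalar example, not the paper's operator-learning formulation, and all values are assumptions.

```python
import numpy as np

# Generic attenuation-bias demo (not the paper's EiV operator models).
rng = np.random.default_rng(0)
n = 20000
x_true = rng.normal(size=n)
y = 2.0 * x_true                                  # true slope is 2
x_obs = x_true + rng.normal(scale=1.0, size=n)    # inputs observed with noise

# Naive OLS on noisy inputs shrinks the slope toward zero:
# E[slope] = 2 * var(x) / (var(x) + var(noise)) = 1 here.
slope_ols = float(x_obs @ y / (x_obs @ x_obs))

# EiV-style correction when the input-noise variance (1.0) is known.
slope_eiv = float(x_obs @ y / (x_obs @ x_obs - n * 1.0))
```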
no code implementations • 27 Jan 2021 • Kookjin Lee, Nathaniel A. Trask, Ravi G. Patel, Mamikon A. Gulian, Eric C. Cyr
Approximation theorists have established best-in-class optimal approximation rates of deep neural networks by utilizing their ability to simultaneously emulate partitions of unity and monomials.
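To make the partition-of-unity mechanism concrete, the sketch below builds linear "hat" functions from ReLUs (a partition of unity on [0, 1]) and combines them with local values of a target function; this is a standard construction shown for illustration, not the paper's networks.

```python
import numpy as np

# ReLU-built partition of unity times local values: the mechanism
# behind the approximation rates mentioned above. Grid sizes and the
# target function are illustrative assumptions.

def relu(z):
    return np.maximum(z, 0.0)

def hat(x, c, h):
    # A width-2h hat of height 1 centered at c, from three ReLUs.
    return (relu(x - c + h) - 2 * relu(x - c) + relu(x - c - h)) / h

x = np.linspace(0, 1, 201)
h = 0.1
centers = np.arange(0, 1 + h / 2, h)
phis = np.stack([hat(x, c, h) for c in centers])   # partition of unity
pou_sum = phis.sum(axis=0)                          # equals 1 on [0, 1]

# Piecewise-linear approximation of a smooth target via the PoU.
f = np.sin(2 * np.pi * x)
approx = (np.sin(2 * np.pi * centers)[:, None] * phis).sum(axis=0)
err = float(np.abs(approx - f).max())
```

The hats sum to one everywhere on the interval, so multiplying them against local function values recovers classical piecewise-linear interpolation with a ReLU network.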
1 code implementation • 25 Sep 2020 • Ravi G. Patel, Nathaniel A. Trask, Mitchell A. Wood, Eric C. Cyr
Applying deep learning to the discovery of data-driven models requires careful use of inductive biases to obtain a description of physics that is both accurate and robust.
no code implementations • 17 Jun 2020 • Ravi G. Patel, Nathaniel A. Trask, Mamikon A. Gulian, Eric C. Cyr
By alternating between a second-order method to find globally optimal parameters for the linear layer and gradient descent to train the hidden layers, we ensure an optimal fit of the adaptive basis to data throughout training.
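A minimal sketch of this alternating scheme, under assumed toy settings (one hidden tanh layer, with a least-squares solve standing in for the second-order step):

```python
import numpy as np

# Toy alternating optimization: least squares for the linear output
# layer, gradient descent for the hidden layer. Sizes, rates, and the
# target function are illustrative assumptions, not the paper's setup.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 64)[:, None]
y = np.sin(np.pi * x)

W = rng.normal(scale=2.0, size=(1, 16))   # hidden weights (gradient descent)
b = rng.uniform(-2, 2, size=16)
lr = 1e-3

for _ in range(200):
    h = np.tanh(x @ W + b)                       # adaptive basis Phi(x; W, b)
    # Globally optimal linear coefficients for the current basis.
    c, *_ = np.linalg.lstsq(h, y, rcond=1e-3)
    # Gradient-descent step on the hidden layer, holding c fixed.
    r = h @ c - y                                # residual
    da = (r @ c.T) * (1 - h**2)                  # backprop through tanh
    W -= lr * (x.T @ da) / len(x)
    b -= lr * da.mean(axis=0)

mse = float(np.mean((np.tanh(x @ W + b) @ c - y) ** 2))
```

Because the linear solve re-fits the output layer exactly at every iteration, the hidden layer only has to learn a good basis rather than the full regression.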
no code implementations • 10 Dec 2019 • Eric C. Cyr, Mamikon A. Gulian, Ravi G. Patel, Mauro Perego, Nathaniel A. Trask
Motivated by the gap between theoretical optimal approximation rates of deep neural networks (DNNs) and the accuracy realized in practice, we seek to improve the training of DNNs.
2 code implementations • 7 Sep 2019 • Nathaniel Trask, Ravi G. Patel, Ben J. Gross, Paul J. Atzberger
Data fields sampled on irregularly spaced points arise in many applications in the sciences and engineering.
no code implementations • 19 Oct 2018 • Ravi G. Patel, Olivier Desjardins
The method parametrizes the spatial operator with neural networks and Fourier transforms such that it can fit a class of nonlinear operators without needing a library of a priori selected operators.
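A hedged sketch of that parametrization idea, composing a pointwise neural network with a learnable Fourier multiplier; the architecture sizes and names are illustrative assumptions, and the weights here are random rather than trained.

```python
import numpy as np

# Sketch of an operator built from a pointwise neural net and a Fourier
# multiplier: L[u] = IFFT( m(k) * FFT( f(u) ) ). Sizes, names, and the
# (random, untrained) weights are assumptions, not the paper's model.
rng = np.random.default_rng(0)
n = 128

def pointwise_net(u, W1, b1, W2, b2):
    """Tiny MLP applied independently at every grid point."""
    return (np.tanh(u[:, None] * W1 + b1) @ W2 + b2)[:, 0]

W1, b1 = rng.normal(size=(1, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 1)), rng.normal(size=1)
m = rng.normal(size=n) + 1j * rng.normal(size=n)   # learnable multiplier m(k)

def operator(u):
    v = pointwise_net(u, W1, b1, W2, b2)           # pointwise nonlinearity
    v_hat = np.fft.fft(v) * m                      # act in Fourier space
    return np.real(np.fft.ifft(v_hat))             # back to physical space

x = np.linspace(0, 2 * np.pi, n, endpoint=False)
out = operator(np.sin(x))
```

Training m(k) alongside the network weights lets the model discover differential-operator-like behavior without enumerating a library of candidate operators.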