no code implementations • 21 Dec 2024 • Govinda Anantha Padmanabha, Cosmin Safta, Nikolaos Bouklas, Reese E. Jones
We propose a Stein variational gradient descent method to concurrently sparsify, train, and provide uncertainty quantification of a complexly parameterized model such as a neural network.
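The listing carries only the abstract excerpt; as a rough illustration of the underlying idea, the following is a minimal NumPy sketch of one Stein variational gradient descent update with an RBF kernel. The particle set, bandwidth, and `grad_log_p` callback are illustrative choices, not details taken from the paper.

```python
import numpy as np

def svgd_step(theta, grad_log_p, step_size=1e-2, bandwidth=1.0):
    """One SVGD update on a set of parameter 'particles'.

    theta:      (n, d) array, n particles of a d-dimensional model.
    grad_log_p: callable returning the (n, d) gradients of the log posterior.
    """
    n = theta.shape[0]
    # RBF kernel k(theta_j, theta_i) = exp(-||theta_j - theta_i||^2 / (2 h^2))
    sq_dists = np.sum((theta[:, None, :] - theta[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))
    # sum_j grad_{theta_j} k(theta_j, theta_i), evaluated for every particle i
    grad_K = (K.sum(axis=0)[:, None] * theta - K @ theta) / bandwidth ** 2
    # phi(theta_i) = (1/n) sum_j [ k(theta_j, theta_i) grad log p(theta_j)
    #                              + grad_{theta_j} k(theta_j, theta_i) ]
    phi = (K @ grad_log_p(theta) + grad_K) / n
    return theta + step_size * phi

# Toy usage: particles drift toward a standard normal posterior.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    particles = rng.normal(loc=5.0, scale=1.0, size=(50, 2))
    for _ in range(500):
        particles = svgd_step(particles, grad_log_p=lambda th: -th)
    print(particles.mean(axis=0), particles.std(axis=0))
```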
no code implementations • 30 Jun 2024 • Govinda Anantha Padmanabha, Jan Niklas Fuhg, Cosmin Safta, Reese E. Jones, Nikolaos Bouklas
Specifically, $L_0$+SVGD demonstrates superior resilience to noise, the ability to perform well in extrapolated regions, and a faster convergence rate to an optimal solution.
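The excerpt above is a results claim; the method name indicates an $L_0$-style sparsity penalty combined with the SVGD update sketched earlier. One common differentiable surrogate for an $L_0$ penalty is the hard-concrete gate (Louizos et al., 2018); the PyTorch sketch below illustrates that generic construction only and is not taken from the paper.

```python
import math
import torch
import torch.nn as nn

class HardConcreteGate(nn.Module):
    """Differentiable surrogate for an L0 penalty on a weight tensor.

    Each weight gets a stochastic gate z in [0, 1]; the expected number of
    open gates serves as the relaxed L0 regularizer.
    """
    def __init__(self, shape, beta=2.0 / 3.0, gamma=-0.1, zeta=1.1):
        super().__init__()
        self.log_alpha = nn.Parameter(torch.zeros(shape))
        self.beta, self.gamma, self.zeta = beta, gamma, zeta

    def forward(self):
        if self.training:
            u = torch.rand_like(self.log_alpha).clamp(1e-6, 1 - 1e-6)
            s = torch.sigmoid((u.log() - (1 - u).log() + self.log_alpha) / self.beta)
        else:
            s = torch.sigmoid(self.log_alpha)
        s_bar = s * (self.zeta - self.gamma) + self.gamma
        return s_bar.clamp(0.0, 1.0)

    def expected_l0(self):
        # Probability that each gate is non-zero, summed over all weights.
        return torch.sigmoid(
            self.log_alpha - self.beta * math.log(-self.gamma / self.zeta)
        ).sum()

# Usage: mask a weight matrix and add the relaxed L0 term to the loss.
gate = HardConcreteGate((64, 32))
weight = nn.Parameter(torch.randn(64, 32))
sparse_weight = weight * gate()
l0_penalty = 1e-3 * gate.expected_l0()
```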
no code implementations • 6 May 2024 • Jan Niklas Fuhg, Govinda Anantha Padmanabha, Nikolaos Bouklas, Bahador Bahmani, WaiChing Sun, Nikolaos N. Vlassis, Moritz Flaschel, Pietro Carrara, Laura De Lorenzis
This review article highlights state-of-the-art data-driven techniques to discover, encode, surrogate, or emulate constitutive laws that describe the path-independent and path-dependent response of solids.
1 code implementation • 8 Mar 2021 • Govinda Anantha Padmanabha, Nicholas Zabaras
In addition, it is challenging to develop accurate surrogate and uncertainty quantification models for high-dimensional problems governed by stochastic multiscale PDEs using limited training data.
1 code implementation • 31 Jul 2020 • Govinda Anantha Padmanabha, Nicholas Zabaras
In this work, we construct two- and three-dimensional inverse surrogate models, each consisting of an invertible neural network and a conditional neural network trained end-to-end with limited training data.
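As a rough sketch of the kind of architecture described (an invertible network conditioned on observations), the following is a single conditional affine coupling block in PyTorch. The layer sizes and the way the conditioning vector is injected are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class ConditionalAffineCoupling(nn.Module):
    """One invertible coupling block whose scale/shift depend on a conditioning vector.

    Stacking several such blocks (with permutations in between) yields an
    invertible map x <-> z conditioned on observations y.
    """
    def __init__(self, dim, cond_dim, hidden=128):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x, cond):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(torch.cat([x1, cond], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                      # keep scales well-behaved
        z2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=1)                 # log |det Jacobian| of the block
        return torch.cat([x1, z2], dim=1), log_det

    def inverse(self, z, cond):
        z1, z2 = z[:, :self.d], z[:, self.d:]
        s, t = self.net(torch.cat([z1, cond], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)
        x2 = (z2 - t) * torch.exp(-s)
        return torch.cat([z1, x2], dim=1)
```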