Search Results for author: Bahador Bahmani

Found 8 papers, 1 paper with code

Neural Chaos: A Spectral Stochastic Neural Operator

no code implementations17 Feb 2025 Bahador Bahmani, Ioannis G. Kevrekidis, Michael D. Shields

To achieve this, we propose an algorithm that identifies NN-parameterized basis functions in a purely data-driven manner, without any prior assumptions about the joint distribution of the random variables involved, whether independent or dependent.
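As a rough illustration of constructing basis functions from data alone, the sketch below orthonormalizes a monomial basis against the *empirical* measure of sampled inputs via Gram-Schmidt. This is a simple classical analogue, not the paper's NN-parameterized method; the Beta-distributed samples and degree are arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples of a random input whose distribution we treat as unknown.
xi = rng.beta(2.0, 5.0, size=5000)

# Orthonormalize monomials with respect to the empirical inner product
# <u, v> = mean(u * v), i.e. no assumed density for xi.
degree = 3
V = np.stack([xi**k for k in range(degree + 1)], axis=1)
basis = []
for v in V.T:
    for b in basis:
        v = v - np.mean(v * b) * b  # remove projection onto earlier basis
    basis.append(v / np.sqrt(np.mean(v * v)))  # normalize
Phi = np.stack(basis, axis=1)

# Orthonormality holds with respect to the data, not an assumed distribution.
gram = Phi.T @ Phi / len(xi)
print(np.allclose(gram, np.eye(degree + 1), atol=1e-8))  # True
```

The same idea extends to dependent multivariate inputs, since the empirical inner product never requires independence or a known joint density.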

Uncertainty Quantification

A Resolution Independent Neural Operator

no code implementations17 Jul 2024 Bahador Bahmani, Somdatta Goswami, Ioannis G. Kevrekidis, Michael D. Shields

Similarly, the dictionary learning algorithms identify basis functions for output data, defining a new neural operator architecture: the Resolution Independent Neural Operator (RINO).

Dictionary Learning Operator learning

A review on data-driven constitutive laws for solids

no code implementations6 May 2024 Jan Niklas Fuhg, Govinda Anantha Padmanabha, Nikolaos Bouklas, Bahador Bahmani, WaiChing Sun, Nikolaos N. Vlassis, Moritz Flaschel, Pietro Carrara, Laura De Lorenzis

This review article highlights state-of-the-art data-driven techniques to discover, encode, surrogate, or emulate constitutive laws that describe the path-independent and path-dependent response of solids.

Discovering interpretable elastoplasticity models via the neural polynomial method enabled symbolic regressions

no code implementations24 Jul 2023 Bahador Bahmani, Hyoung Suk Suh, WaiChing Sun

A post-processing step is then used to re-interpret the set of single-variable neural network mapping functions into mathematical form through symbolic regression.
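A minimal stand-in for that post-processing step: sample a learned single-variable mapping and regress it onto a polynomial basis to recover a closed-form expression. The `learned_map` function below is a hypothetical placeholder for a trained network, and plain polynomial least squares is used as a simple proxy for full symbolic regression.

```python
import numpy as np

# Hypothetical stand-in for a trained single-variable neural network mapping.
def learned_map(x):
    return 0.5 * x**3 - 2.0 * x + 1.0

# Sample the learned mapping over its input domain.
x = np.linspace(-2.0, 2.0, 200)
y = learned_map(x)

# Re-interpret the mapping in mathematical form via polynomial regression.
coeffs = np.polyfit(x, y, deg=3)

# Assemble a readable expression, dropping numerically negligible terms.
terms = [f"{c:+.3f}*x^{3 - i}" for i, c in enumerate(coeffs) if abs(c) > 1e-8]
print(" ".join(terms))
```

In practice a symbolic-regression engine searches over richer operator sets (products, powers, transcendental functions), but the interpretability payoff is the same: a compact formula replacing an opaque network.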

regression Symbolic Regression

Training multi-objective/multi-task collocation physics-informed neural network with student/teachers transfer learnings

no code implementations24 Jul 2021 Bahador Bahmani, WaiChing Sun

This paper presents a PINN training framework that employs (1) pre-training steps that accelerate and improve the robustness of training a physics-informed neural network with auxiliary data stored in point clouds, (2) a net-to-net knowledge transfer algorithm that improves the weight initialization of the neural network, and (3) a multi-objective optimization algorithm that may improve the performance of a physics-informed neural network with competing constraints.
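To illustrate the competing-constraints issue in the simplest possible setting, the sketch below minimizes a weighted sum of two quadratic objectives with conflicting minimizers, standing in for, say, a PDE residual term versus a boundary-data misfit. This is weighted-sum scalarization, a baseline technique; the paper's multi-objective algorithm is more sophisticated, and all numbers here are illustrative.

```python
# Two competing objectives: loss_a is minimized at w = 1, loss_b at w = -1.
def loss_a(w):
    return (w - 1.0) ** 2

def loss_b(w):
    return (w + 1.0) ** 2

def grad(f, w, h=1e-6):
    # Central finite-difference gradient (keeps the demo dependency-free).
    return (f(w + h) - f(w - h)) / (2 * h)

# Gradient descent on the scalarized objective alpha*loss_a + (1-alpha)*loss_b.
alpha, w, lr = 0.7, 0.0, 0.1
for _ in range(200):
    w -= lr * (alpha * grad(loss_a, w) + (1 - alpha) * grad(loss_b, w))

print(round(w, 3))  # converges near the trade-off point 2*alpha - 1 = 0.4
```

The converged weight sits between the two individual minimizers, showing how the weighting alpha trades one constraint off against the other; a multi-objective method instead navigates this trade-off without fixing the weights a priori.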

Multi-Task Learning

Equivariant geometric learning for digital rock physics: estimating formation factor and effective permeability tensors from Morse graph

no code implementations12 Apr 2021 Chen Cai, Nikolaos Vlassis, Lucas Magee, Ran Ma, Zeyu Xiong, Bahador Bahmani, Teng-Fong Wong, Yusu Wang, WaiChing Sun

Comparisons of predictions from the trained CNN against those from graph convolutional neural networks (GNNs) trained with and without the equivariant constraint indicate that the equivariant graph neural network tends to outperform both the CNN and the GNN trained without enforcing equivariance.

Graph Neural Network