Search Results for author: Robert M. Kirby

Found 21 papers, 6 papers with code

Complexity-Aware Deep Symbolic Regression with Robust Risk-Seeking Policy Gradients

no code implementations · 10 Jun 2024 · Zachary Bastiani, Robert M. Kirby, Jacob Hochhalter, Shandian Zhe

This paper proposes a novel deep symbolic regression approach to enhance the robustness and interpretability of data-driven mathematical expression discovery.

Regression · Symbolic Regression

Polynomial-Augmented Neural Networks (PANNs) with Weak Orthogonality Constraints for Enhanced Function and PDE Approximation

no code implementations · 4 Jun 2024 · Madison Cooley, Shandian Zhe, Robert M. Kirby, Varun Shankar

We present polynomial-augmented neural networks (PANNs), a novel machine learning architecture that combines deep neural networks (DNNs) with a polynomial approximant.


Kolmogorov n-Widths for Multitask Physics-Informed Machine Learning (PIML) Methods: Towards Robust Metrics

no code implementations · 16 Feb 2024 · Michael Penwarden, Houman Owhadi, Robert M. Kirby

This topic encompasses a broad array of methods and models aimed at solving a single PDE problem or a collection of PDE problems, the latter setting being called multitask learning.

Physics-informed machine learning

Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels

1 code implementation · 9 Oct 2023 · Da Long, Wei W. Xing, Aditi S. Krishnapriyan, Robert M. Kirby, Shandian Zhe, Michael W. Mahoney

To overcome the computational challenge of kernel regression, we place the function values on a mesh and induce a Kronecker product construction, and we use tensor algebra to enable efficient computation and optimization.

Equation Discovery · Regression +1
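The mesh-plus-Kronecker trick described in the snippet above can be illustrated in a few lines of NumPy (a minimal sketch with made-up kernel and grid sizes, not the paper's implementation): when function values sit on a tensor-product mesh, the full kernel factors as K = Kx ⊗ Ky, and tensor algebra reduces matrix-vector products with K to small matrix products.

```python
import numpy as np

def rbf_kernel(x, lengthscale=0.5):
    # 1D squared-exponential kernel matrix on points x
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Function values live on a tensor-product mesh, so the full kernel
# factors as K = Kx kron Ky and never needs to be formed explicitly.
nx, ny = 20, 15
xs = np.linspace(0.0, 1.0, nx)
ys = np.linspace(0.0, 1.0, ny)
Kx = rbf_kernel(xs) + 1e-6 * np.eye(nx)  # small jitter for conditioning
Ky = rbf_kernel(ys) + 1e-6 * np.eye(ny)

# Matrix-vector product with kron(Kx, Ky) via the identity
# (Kx kron Ky) vec(V) = vec(Ky V Kx^T), avoiding the (nx*ny)^2 matrix.
V = np.random.default_rng(0).standard_normal((ny, nx))
fast = (Ky @ V @ Kx.T).ravel(order="F")
```

For this toy size the dense Kronecker product is only 300×300, but the factored form is what makes kernel regression on fine meshes tractable.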

Neural Operator Learning for Ultrasound Tomography Inversion

1 code implementation · 6 Apr 2023 · Haocheng Dai, Michael Penwarden, Robert M. Kirby, Sarang Joshi

Neural operator learning as a means of mapping between complex function spaces has garnered significant attention in the field of computational science and engineering (CS&E).

Operator learning

Deep neural operators can serve as accurate surrogates for shape optimization: A case study for airfoils

no code implementations · 2 Feb 2023 · Khemraj Shukla, Vivek Oommen, Ahmad Peyvan, Michael Penwarden, Luis Bravo, Anindya Ghoshal, Robert M. Kirby, George Em Karniadakis

Deep neural operators, such as DeepONets, have changed the paradigm in high-dimensional nonlinear regression from function regression to (differential) operator regression, paving the way for significant changes in computational engineering applications.
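The branch-trunk structure behind DeepONet-style operator regression can be sketched as follows (an untrained, illustrative NumPy toy, not the authors' code; all layer sizes and names are arbitrary): a branch net embeds the input function from its sensor samples, a trunk net embeds the query coordinate, and their inner product gives the operator output G(u)(y).

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyDeepONet:
    """Untrained branch-trunk sketch: G(u)(y) ~ sum_k b_k(u) * t_k(y)."""

    def __init__(self, m, p=8, hidden=16):
        # branch net: input function u sampled at m sensor points
        self.Wb1 = rng.standard_normal((m, hidden)) / np.sqrt(m)
        self.Wb2 = rng.standard_normal((hidden, p)) / np.sqrt(hidden)
        # trunk net: scalar query coordinate y
        self.Wt1 = rng.standard_normal((1, hidden))
        self.Wt2 = rng.standard_normal((hidden, p)) / np.sqrt(hidden)

    def __call__(self, u_sensors, y):
        b = np.tanh(u_sensors @ self.Wb1) @ self.Wb2              # (p,)
        t = np.tanh(np.atleast_1d(y)[:, None] @ self.Wt1) @ self.Wt2  # (n_y, p)
        return t @ b  # operator output at each query point
```

Once trained, evaluating the surrogate at new query points only requires forward passes, which is what makes such operators attractive for shape-optimization loops.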


Batch Multi-Fidelity Active Learning with Budget Constraints

no code implementations · 23 Oct 2022 · Shibo Li, Jeff M. Phillips, Xin Yu, Robert M. Kirby, Shandian Zhe

However, this method queries only one fidelity-input pair at a time, and hence risks selecting strongly correlated examples that reduce learning efficiency.

Active Learning · Diversity

Momentum Transformer: Closing the Performance Gap Between Self-attention and Its Linearization

no code implementations · 1 Aug 2022 · Tan Nguyen, Richard G. Baraniuk, Robert M. Kirby, Stanley J. Osher, Bao Wang

Transformers have achieved remarkable success in sequence modeling and beyond but suffer from quadratic computational and memory complexities with respect to the length of the input sequence.

Image Generation · Machine Translation
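The quadratic-vs-linear contrast the abstract refers to can be sketched with a kernelized attention variant (a hedged NumPy toy in the spirit of linear-attention methods generally, not the Momentum Transformer itself): replacing the softmax with a feature map φ lets the products be reassociated so the n×n score matrix is never formed.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # standard self-attention: the n x n score matrix makes it O(n^2)
    s = Q @ K.T / np.sqrt(Q.shape[-1])
    w = np.exp(s - s.max(axis=-1, keepdims=True))
    return (w / w.sum(axis=-1, keepdims=True)) @ V

def linear_attention(Q, K, V, eps=1e-6):
    # kernelized attention with feature map phi(x) = elu(x) + 1;
    # associativity lets us form phi(K)^T V once, giving O(n) cost
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    Qf, Kf = phi(Q), phi(K)
    Z = Qf @ Kf.sum(axis=0) + eps   # per-query normalizer
    return (Qf @ (Kf.T @ V)) / Z[:, None]
```

The two functions are not numerically equal; closing the accuracy gap between them is exactly the problem the paper addresses.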

Adaptive Self-supervision Algorithms for Physics-informed Neural Networks

1 code implementation · 8 Jul 2022 · Shashank Subramanian, Robert M. Kirby, Michael W. Mahoney, Amir Gholami

We find that training vanilla PINNs for these problems can result in up to 70% prediction error in the solution, especially in the regime of low collocation points.

Infinite-Fidelity Coregionalization for Physical Simulation

no code implementations · 1 Jul 2022 · Shibo Li, Zheng Wang, Robert M. Kirby, Shandian Zhe

Our model can interpolate and/or extrapolate the predictions to novel fidelities, which can be even higher than the fidelities of training data.

Gaussian Processes

Machine Learning in Heterogeneous Porous Materials

no code implementations · 4 Feb 2022 · Martha D'Elia, Hang Deng, Cedric Fraces, Krishna Garikipati, Lori Graham-Brady, Amanda Howard, George Karniadakis, Vahid Keshavarzzadeh, Robert M. Kirby, Nathan Kutz, Chunhui Li, Xing Liu, Hannah Lu, Pania Newell, Daniel O'Malley, Masa Prodanovic, Gowri Srinivasan, Alexandre Tartakovsky, Daniel M. Tartakovsky, Hamdi Tchelepi, Bozo Vazic, Hari Viswanathan, Hongkyu Yoon, Piotr Zarzycki

The "Workshop on Machine Learning in Heterogeneous Porous Materials" brought together the international applied-mathematics, porous-media, and materials-science communities with experts in heterogeneous materials and machine learning (ML) to identify how ML can advance materials research.

Machine Learning

A Metalearning Approach for Physics-Informed Neural Networks (PINNs): Application to Parameterized PDEs

no code implementations · 26 Oct 2021 · Michael Penwarden, Shandian Zhe, Akil Narayan, Robert M. Kirby

Physics-informed neural networks (PINNs) as a means of discretizing partial differential equations (PDEs) are garnering much attention in the Computational Science and Engineering (CS&E) world.

Machine Learning · Physics-informed Machine Learning +1

Characterizing possible failure modes in physics-informed neural networks

2 code implementations · NeurIPS 2021 · Aditi S. Krishnapriyan, Amir Gholami, Shandian Zhe, Robert M. Kirby, Michael W. Mahoney

We provide evidence that the soft regularization in PINNs, which involves PDE-based differential operators, can introduce a number of subtle problems, including making the problem more ill-conditioned.
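The soft-regularization structure referred to above, a boundary/data misfit plus a weighted PDE-residual penalty, can be sketched for a 1-D Poisson problem (a minimal finite-difference stand-in for the autodiff residual in real PINNs; the function name and weighting are illustrative):

```python
import numpy as np

# Sketch of the composite PINN-style objective for u'' = f on [0, 1]
# with Dirichlet BCs: boundary misfit plus a soft PDE-residual penalty.
# In real PINNs the residual differentiates the network output via
# autodiff; second differences on a grid stand in for that here.
def pinn_style_loss(u, f, h, lam=1.0):
    res = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2 - f[1:-1]
    bc = u[0] ** 2 + u[-1] ** 2           # Dirichlet boundary misfit
    return bc + lam * np.mean(res ** 2)   # lam weights the soft constraint
```

Because the residual term divides by h² (more generally, applies a differential operator), small errors in u are amplified, which is one intuition for the ill-conditioning the paper documents.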

Multifidelity Modeling for Physics-Informed Neural Networks (PINNs)

no code implementations · 25 Jun 2021 · Michael Penwarden, Shandian Zhe, Akil Narayan, Robert M. Kirby

Candidates for this approach are simulation methodologies in which differences in fidelity are tied to significant differences in computational cost.

A bandit-learning approach to multifidelity approximation

no code implementations · 29 Mar 2021 · Yiming Xu, Vahid Keshavarzzadeh, Robert M. Kirby, Akil Narayan

Multifidelity approximation is an important technique in scientific computation and simulation.

Nektar++: enhancing the capability and application of high-fidelity spectral/$hp$ element methods

no code implementations · 8 Jun 2019 · David Moxey, Chris D. Cantwell, Yan Bao, Andrea Cassinelli, Giacomo Castiglioni, Sehun Chun, Emilia Juda, Ehsan Kazemi, Kilian Lackhove, Julian Marcon, Gianmarco Mengaldo, Douglas Serson, Michael Turner, Hui Xu, Joaquim Peiró, Robert M. Kirby, Spencer J. Sherwin

Nektar++ is an open-source framework that provides a flexible, high-performance and scalable platform for the development of solvers for partial differential equations using the high-order spectral/$hp$ element method.

Mathematical Software · Numerical Analysis · Fluid Dynamics
