Search Results for author: Yihang Gao

Found 7 papers, 1 paper with code

Approximating Probability Distributions by using Wasserstein Generative Adversarial Networks

no code implementations • 18 Mar 2021 • Yihang Gao, Michael K. Ng, Mingjie Zhou

Studied here are Wasserstein generative adversarial networks (WGANs) with GroupSort neural networks as their discriminators.
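The GroupSort activation mentioned above sorts each consecutive group of pre-activations; being a data-dependent permutation, it is 1-Lipschitz and norm-preserving, which is what makes it attractive for Lipschitz-constrained WGAN discriminators. Below is a minimal PyTorch sketch of the activation only, not the paper's discriminator architecture or its approximation analysis; the group size of 2 is an illustrative choice.

```python
import torch

def group_sort(x: torch.Tensor, group_size: int = 2) -> torch.Tensor:
    # Sort activations within consecutive groups along the feature dimension.
    # With group_size=2 this reduces to the MaxMin activation; as a
    # permutation of its inputs it is 1-Lipschitz and norm-preserving.
    n = x.shape[-1]
    assert n % group_size == 0, "feature dimension must be divisible by group_size"
    grouped = x.reshape(*x.shape[:-1], n // group_size, group_size)
    return torch.sort(grouped, dim=-1).values.reshape(x.shape)
```

Stacking this activation between norm-constrained linear layers yields a GroupSort network of the kind used as the discriminator.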

Wasserstein Generative Adversarial Uncertainty Quantification in Physics-Informed Neural Networks

1 code implementation • 30 Aug 2021 • Yihang Gao, Michael K. Ng

In this paper, we study a physics-informed algorithm for Wasserstein Generative Adversarial Networks (WGANs) for uncertainty quantification in solutions of partial differential equations.

Uncertainty Quantification
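The "physics-informed" ingredient of the setup above amounts to penalizing the residual of the governing PDE, computed with automatic differentiation. The sketch below shows only that ingredient for a toy 1-D Poisson problem; the network `u_net`, the forcing term, and the coupling to a WGAN critic are placeholders, not the paper's actual construction.

```python
import torch

def pde_residual(u_net, x):
    # Residual of a toy Poisson equation u''(x) = f(x), obtained by
    # differentiating the network output with autograd.  Driving this
    # residual toward zero is the physics-informed loss term.
    x = x.clone().requires_grad_(True)
    u = u_net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    f = torch.sin(x)          # assumed forcing term, purely for illustration
    return d2u - f            # e.g. add residual.pow(2).mean() to the loss
```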

HessianFR: An Efficient Hessian-based Follow-the-Ridge Algorithm for Minimax Optimization

no code implementations • 23 May 2022 • Yihang Gao, Huafeng Liu, Michael K. Ng, Mingjie Zhou

Wide applications of differentiable two-player sequential games (e.g., image generation by GANs) have attracted considerable interest from researchers in developing efficient and fast algorithms.

Image Generation

Approximate Secular Equations for the Cubic Regularization Subproblem

no code implementations • 27 Sep 2022 • Yihang Gao, Man-Chung Yue, Michael K. Ng

In this paper, we propose and analyze a novel CRS solver based on an approximate secular equation, which requires only some of the Hessian eigenvalues and is therefore much more efficient.
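For context, the cubic regularization subproblem (CRS) is min_s g^T s + 0.5 s^T H s + (sigma/3)||s||^3, and its global minimizer satisfies (H + lambda*I)s = -g with lambda = sigma*||s||, which reduces to a one-dimensional secular equation in lambda. The NumPy sketch below solves that equation by bisection using a full eigendecomposition, purely as a baseline for intuition; the paper's point that only some of the eigenvalues are needed (and the so-called hard case) is not reproduced here.

```python
import numpy as np

def crs_via_secular_equation(H, g, sigma, tol=1e-10):
    # Solve min_s g^T s + 0.5 s^T H s + (sigma/3)||s||^3 by finding lambda with
    # ||(H + lambda*I)^{-1} g|| = lambda / sigma   (the secular equation).
    # Baseline version: full eigendecomposition, hard case not handled.
    lams, Q = np.linalg.eigh(H)
    b = Q.T @ g
    phi = lambda lam: np.sum(b**2 / (lams + lam)**2) - (lam / sigma)**2
    lo = max(0.0, -lams.min()) + 1e-12    # lambda must exceed max(0, -lambda_min)
    hi = max(2.0 * lo, 1.0)
    while phi(hi) > 0:                    # phi is decreasing in lambda: grow bracket
        hi *= 2.0
    while hi - lo > tol:                  # bisection on the scalar equation
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if phi(mid) > 0 else (lo, mid)
    lam = 0.5 * (lo + hi)
    return Q @ (-b / (lams + lam))        # s = -(H + lambda*I)^{-1} g
```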

A Momentum Accelerated Adaptive Cubic Regularization Method for Nonconvex Optimization

no code implementations • 12 Oct 2022 • Yihang Gao, Michael K. Ng

The cubic regularization method (CR) and its adaptive version (ARC) are popular Newton-type methods for solving unconstrained non-convex optimization problems, owing to their global convergence to local minima under mild conditions.

regression
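The CR/ARC family referenced above works by repeatedly minimizing a cubic model of the objective and adapting the regularization weight sigma from the ratio of actual to predicted decrease. The sketch below is a plain ARC skeleton, reusing `crs_via_secular_equation` from the previous entry as the subproblem solver; the momentum acceleration that the paper actually proposes is not shown.

```python
import numpy as np

def arc_minimize(f, grad, hess, x0, sigma=1.0, eta1=0.1, eta2=0.9,
                 gamma=2.0, tol=1e-6, max_iter=100):
    # Plain adaptive cubic regularization (ARC) loop: solve the cubic
    # subproblem, then accept/reject the step and adapt sigma based on the
    # ratio of actual to model-predicted decrease.  No momentum term here.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        s = crs_via_secular_equation(H, g, sigma)   # see previous sketch
        pred = -(g @ s + 0.5 * s @ H @ s + sigma / 3.0 * np.linalg.norm(s) ** 3)
        rho = (f(x) - f(x + s)) / pred              # actual vs predicted decrease
        if rho >= eta1:
            x = x + s                               # successful step: accept
        if rho >= eta2:
            sigma = max(sigma / gamma, 1e-8)        # very successful: relax sigma
        elif rho < eta1:
            sigma *= gamma                          # poor model fit: regularize more
    return x
```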

SVD-PINNs: Transfer Learning of Physics-Informed Neural Networks via Singular Value Decomposition

no code implementations • 16 Nov 2022 • Yihang Gao, Ka Chun Cheung, Michael K. Ng

Physics-informed neural networks (PINNs) have attracted significant attention for solving partial differential equations (PDEs) in recent years because they alleviate the curse of dimensionality that appears in traditional methods.

Transfer Learning
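One natural reading of the title is to factor a pretrained hidden-layer weight matrix as U diag(s) V^T, freeze the singular vectors learned on the source PDE, and fine-tune only the singular values (and biases) on the target PDE. The PyTorch layer below sketches that reading; the class name, and whether the paper freezes exactly these components, are assumptions made only for illustration.

```python
import torch

class SVDTransferLinear(torch.nn.Module):
    # Hypothetical layer for SVD-based transfer: singular vectors of a
    # pretrained weight matrix are frozen buffers, while the singular
    # values (and the bias) remain trainable for the target problem.
    def __init__(self, pretrained_weight, pretrained_bias):
        super().__init__()
        U, s, Vh = torch.linalg.svd(pretrained_weight.detach(), full_matrices=False)
        self.register_buffer("U", U)                 # frozen left singular vectors
        self.register_buffer("Vh", Vh)               # frozen right singular vectors
        self.s = torch.nn.Parameter(s.clone())       # trainable singular values
        self.bias = torch.nn.Parameter(pretrained_bias.detach().clone())

    def forward(self, x):
        W = self.U @ torch.diag(self.s) @ self.Vh    # reassemble weight on the fly
        return x @ W.T + self.bias
```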

On the Expressive Power of a Variant of the Looped Transformer

no code implementations • 21 Feb 2024 • Yihang Gao, Chuanyang Zheng, Enze Xie, Han Shi, Tianyang Hu, Yu Li, Michael K. Ng, Zhenguo Li, Zhaoqiang Liu

Previous works attempt to explain this from the perspectives of expressive power and capability, showing that standard transformers are capable of performing some algorithms.
