Search Results for author: Qihong Yang

Found 5 papers, 1 paper with code

Moving Sampling Physics-informed Neural Networks induced by Moving Mesh PDE

no code implementations • 14 Nov 2023 • Yu Yang, Qihong Yang, Yangtao Deng, Qiaolin He

In this work, we propose an end-to-end adaptive sampling neural network (MMPDE-Net) based on the moving mesh method, which can adaptively generate new sampling points by solving the moving mesh PDE.
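The listing gives only the abstract, but the moving-mesh idea it names can be illustrated in one dimension: concentrate sampling points where a monitor function (e.g., a measure of solution roughness) is large. Below is a minimal, hypothetical sketch based on equidistribution of a monitor function; it illustrates the general moving-mesh principle, not the MMPDE-Net architecture itself, and all names are illustrative.

```python
import numpy as np

def redistribute_points(x, monitor, n_new):
    """Redistribute 1D sampling points by equidistributing a monitor
    function: more points land where `monitor` is large.

    Hypothetical illustration of the moving-mesh principle; MMPDE-Net
    instead learns such a mapping by solving the moving mesh PDE.
    """
    w = monitor(x)  # monitor values on the current points
    # cumulative "mass" of the monitor function (trapezoidal rule)
    mass = np.concatenate(
        [[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))]
    )
    mass /= mass[-1]  # normalize to [0, 1]
    # invert the cumulative mass: equal mass increments -> new points
    targets = np.linspace(0.0, 1.0, n_new)
    return np.interp(targets, mass, x)

# Example: concentrate points near a sharp layer at x = 0.5
x = np.linspace(0.0, 1.0, 200)
monitor = lambda x: 1.0 + 50.0 * np.exp(-200.0 * (x - 0.5) ** 2)
new_points = redistribute_points(x, monitor, n_new=100)
```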

Neural Networks Based on Power Method and Inverse Power Method for Solving Linear Eigenvalue Problems

1 code implementation • 22 Sep 2022 • Qihong Yang, Yangtao Deng, Yu Yang, Qiaolin He, Shiquan Zhang

In this article, we propose two kinds of neural networks, inspired by the power method and the inverse power method, to solve linear eigenvalue problems.
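For context, the classical iterations these networks are modeled on are straightforward; the NumPy sketch below shows the plain power and inverse power methods, with the caveat that the paper's networks represent the eigenfunction with a neural network rather than a vector iterate.

```python
import numpy as np

def power_method(A, num_iters=1000):
    """Classical power method: converges to the dominant eigenpair."""
    v = np.random.default_rng(0).standard_normal(A.shape[0])
    for _ in range(num_iters):
        v = A @ v
        v /= np.linalg.norm(v)  # renormalize each step
    return v @ A @ v, v  # Rayleigh quotient, eigenvector

def inverse_power_method(A, shift=0.0, num_iters=1000):
    """Inverse power method: converges to the eigenvalue nearest `shift`."""
    n = A.shape[0]
    v = np.random.default_rng(1).standard_normal(n)
    M = A - shift * np.eye(n)
    for _ in range(num_iters):
        v = np.linalg.solve(M, v)  # apply the inverse once per step
        v /= np.linalg.norm(v)
    return v @ A @ v, v
```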

Denoising User-aware Memory Network for Recommendation

no code implementations • 12 Jul 2021 • Zhi Bian, Shaojun Zhou, Hao Fu, Qihong Yang, Zhenqi Sun, Junjie Tang, Guiquan Liu, Kaikui Liu, Xiaolong Li

Specifically, the framework: (i) proposes a feature purification module based on orthogonal mapping, which uses the representation of explicit feedback to purify the representation of implicit feedback and effectively denoises it; (ii) designs a user memory network that models users' long-term interests in a fine-grained way, which existing methods ignore, by improving the memory network; and (iii) develops a preference-aware interactive representation component that fuses users' long-term and short-term interests via gating to capture the evolution of their unbiased preferences.

Denoising
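The orthogonal-mapping purification in (i) above can be pictured as an orthogonal decomposition: keep the component of the implicit-feedback representation that aligns with the (more reliable) explicit-feedback representation and discard the residual as noise. The sketch below is one hypothetical reading of that idea, not the paper's exact module; the function and its behavior are illustrative assumptions.

```python
import numpy as np

def purify(implicit, explicit, eps=1e-8):
    """Orthogonal-projection purification (hypothetical sketch).

    Keeps the component of each implicit-feedback vector that lies along
    the corresponding explicit-feedback vector, discarding the orthogonal
    residual as noise. Batched: both arrays have shape (batch, dim).
    """
    # projection coefficient <i, e> / <e, e>, one per example
    coef = np.sum(implicit * explicit, axis=1, keepdims=True)
    coef /= np.sum(explicit * explicit, axis=1, keepdims=True) + eps
    return coef * explicit  # purified implicit representation
```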

LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding

no code implementations • 14 Dec 2020 • Hao Fu, Shaojun Zhou, Qihong Yang, Junjie Tang, Guiquan Liu, Kaikui Liu, Xiaolong Li

In this work, we propose a knowledge distillation method, LRC-BERT, based on contrastive learning, which fits the output of the intermediate layers in terms of angular distance, an aspect not considered by existing distillation methods.

Contrastive Learning • Knowledge Distillation • +1
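A contrastive intermediate-layer loss defined on angular (cosine) distance could take the following form; the function name, the InfoNCE-style objective, and the temperature are illustrative assumptions, since only the abstract is available here.

```python
import numpy as np

def angular_contrastive_loss(student, teacher, temperature=0.1):
    """Contrastive loss on angular (cosine) distance between student and
    teacher intermediate representations (illustrative sketch).

    `student`, `teacher`: (batch, dim) hidden states from one layer.
    Each student vector should be angularly close to its own teacher
    vector (positive) and far from other teachers in the batch (negatives).
    """
    s = student / np.linalg.norm(student, axis=1, keepdims=True)
    t = teacher / np.linalg.norm(teacher, axis=1, keepdims=True)
    sims = (s @ t.T) / temperature  # pairwise cosine similarities
    # InfoNCE-style log-softmax per row; diagonal entries are positives
    logits = sims - sims.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```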
