Search Results for author: Haijun Yu

Found 6 papers, 2 papers with code

OnsagerNet: Learning Stable and Interpretable Dynamics using a Generalized Onsager Principle

1 code implementation · 6 Sep 2020 · Haijun Yu, Xinyuan Tian, Weinan E, Qianxiao Li

We further apply this method to study Rayleigh-Bénard convection and learn Lorenz-like low-dimensional autonomous reduced-order models that capture both qualitative and quantitative properties of the underlying dynamics.
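For reference, the classical Lorenz system is the archetypal three-mode reduced-order model derived from Rayleigh-Bénard convection, which is what "Lorenz-like" alludes to above. A minimal integration sketch (this is not the paper's learned model; the parameter values are the standard chaotic ones):

```python
import numpy as np

def lorenz_rhs(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the classical Lorenz system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, dt=1e-3, steps=20000):
    """Fixed-step RK4 integration of the Lorenz system."""
    traj = np.empty((steps, 3))
    for i in range(steps):
        k1 = lorenz_rhs(state)
        k2 = lorenz_rhs(state + 0.5 * dt * k1)
        k3 = lorenz_rhs(state + 0.5 * dt * k2)
        k4 = lorenz_rhs(state + dt * k3)
        state = state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = state
    return traj

traj = integrate(np.array([1.0, 1.0, 1.0]))
```

Despite having only three degrees of freedom, the trajectory stays bounded while never settling onto a fixed point, which is the qualitative behaviour a good reduced-order model must reproduce.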

ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units using Chebyshev Approximations

2 code implementations · 7 Nov 2019 · Shanshan Tang, Bo Li, Haijun Yu

In this paper, we propose a new and more stable way to construct deep RePU neural networks based on Chebyshev polynomial approximations.
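To sketch the Chebyshev side of this idea: NumPy can build the Chebyshev interpolant of a smooth target, which the paper's construction then converts into a RePU network (that conversion step is not reproduced here). A minimal example:

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Target smooth function on [-1, 1].
f = np.cos

# Degree-10 Chebyshev interpolant at the Chebyshev points.
p = Chebyshev.interpolate(f, 10, domain=[-1, 1])

# Near machine-precision accuracy for an analytic target.
x = np.linspace(-1.0, 1.0, 1001)
max_err = np.max(np.abs(p(x) - f(x)))
```

The rapid (spectral) decay of the Chebyshev coefficients for smooth functions is what makes the resulting network constructions both small and numerically stable.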

PowerNet: Efficient Representations of Polynomials and Smooth Functions by Deep Neural Networks with Rectified Power Units

no code implementations · 9 Sep 2019 · Bo Li, Shanshan Tang, Haijun Yu

In this paper, we construct deep neural networks with rectified power units (RePU), which can give better approximations for smooth functions.
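A quick numerical check of the kind of identity that underlies such constructions — here the textbook power-2 case, where x² = σ(x) + σ(−x) for σ(x) = max(0, x)², and products follow from the polarization identity (the paper's full network construction builds on relations of this kind):

```python
import numpy as np

def repu(x, p=2):
    """Rectified power unit: max(0, x)**p (p = 1 recovers ReLU)."""
    return np.maximum(0.0, x) ** p

def square(x):
    """x**2 realized exactly by two RePU (p = 2) units."""
    return repu(x) + repu(-x)

def product(x, y):
    """x*y via the polarization identity, using four RePU units."""
    return 0.25 * (square(x + y) - square(x - y))

x = np.linspace(-3.0, 3.0, 101)
assert np.allclose(square(x), x ** 2)
assert np.allclose(product(x, 2 * x + 1), x * (2 * x + 1))
```

Because squares and products are represented exactly rather than approximated, polynomials incur no extra approximation error inside a RePU network — the source of the "better approximations" claimed above.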

DLIMD: Dictionary Learning based Image-domain Material Decomposition for spectral CT

no code implementations · 6 May 2019 · Weiwen Wu, Haijun Yu, Peijun Chen, Fulin Luo, Fenglin Liu, Qian Wang, Yining Zhu, Yanbo Zhang, Jian Feng, Hengyong Yu

Second, we employ the direct inversion (DI) method to obtain initial material decomposition results, and a set of image patches is extracted from the mode-1 unfolding of the normalized material image tensor to train a united dictionary with the K-SVD technique.
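A toy sketch of the patch-dictionary step, with the sparsity level fixed at 1 so each K-SVD atom update reduces to a rank-1 SVD of its assigned patches (random data stands in for the mode-1 unfolding of the material image tensor; the actual DLIMD pipeline trains on real decomposition results with a general sparsity level):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the mode-1 unfolding: each column is one vectorized patch.
patches = rng.standard_normal((16, 200))
patches /= np.linalg.norm(patches, axis=0)

n_atoms = 8
D = patches[:, :n_atoms].copy()  # initialize atoms from the data itself

for _ in range(10):
    # Sparse coding at sparsity level 1: nearest atom by correlation.
    idx = np.argmax(np.abs(D.T @ patches), axis=0)
    # K-SVD atom update: rank-1 SVD of the patches assigned to each atom.
    for k in range(n_atoms):
        members = patches[:, idx == k]
        if members.shape[1] == 0:
            continue
        u, s, vt = np.linalg.svd(members, full_matrices=False)
        D[:, k] = u[:, 0]
```

Alternating between sparse coding and per-atom SVD updates is the core K-SVD loop; at higher sparsity levels the update is applied to the residual with the other atoms' contributions removed.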

Computed Tomography (CT) · Dictionary Learning · +1

Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units

no code implementations · 14 Mar 2019 · Bo Li, Shanshan Tang, Haijun Yu

Compared with ReLU networks, the sizes of the RePU networks required by our constructive proofs to approximate functions in Sobolev and Korobov spaces to an error tolerance $\varepsilon$ are in general a factor of $\mathcal{O}(\log \frac{1}{\varepsilon})$ smaller than the sizes of the corresponding ReLU networks.

Numerical Analysis

Application of Bounded Total Variation Denoising in Urban Traffic Analysis

no code implementations · 4 Aug 2018 · Shanshan Tang, Haijun Yu

While denoising is often considered unnecessary in big data applications, we show in this paper that it is helpful in urban traffic analysis: we apply bounded total variation denoising to the urban road traffic prediction and clustering problems.
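A minimal sketch of (smoothed) total variation denoising by gradient descent — not the bounded-TV algorithm of the paper, just the standard ROF-style objective it builds on; the step signal, noise level, and λ below are illustrative choices:

```python
import numpy as np

def tv_denoise(y, lam=0.5, eps=1e-2, lr=0.05, iters=3000):
    """Denoise a 1-D signal by gradient descent on the smoothed TV objective
    0.5*||u - y||^2 + lam * sum_i sqrt((u[i+1] - u[i])^2 + eps)."""
    u = y.copy()
    for _ in range(iters):
        d = np.diff(u)
        w = d / np.sqrt(d * d + eps)  # gradient of the smoothed TV term
        g = u - y
        g[:-1] -= lam * w
        g[1:] += lam * w
        u -= lr * g
    return u

# Noisy step signal: TV denoising should recover the piecewise-constant shape.
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])
noisy = clean + 0.2 * rng.standard_normal(100)
denoised = tv_denoise(noisy)
```

TV regularization penalizes total oscillation rather than curvature, so it flattens noise within segments while preserving the sharp jump — the property that makes it attractive for piecewise-stable quantities such as road traffic speeds.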

Denoising · Traffic Prediction
