1 code implementation • 6 Sep 2020 • Haijun Yu, Xinyuan Tian, Weinan E, Qianxiao Li
We further apply this method to study Rayleigh-Bénard convection and learn Lorenz-like, low-dimensional autonomous reduced-order models that capture both qualitative and quantitative properties of the underlying dynamics.
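The learned reduced-order models are described as Lorenz-like; as a point of reference, the classical Lorenz-63 system can be integrated with a short fourth-order Runge-Kutta sketch (the parameter values below are the standard ones, not taken from the paper):

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the classical Lorenz-63 system."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, state, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate a trajectory; it stays on the bounded Lorenz attractor.
state = np.array([1.0, 1.0, 1.0])
trajectory = [state]
for _ in range(5000):
    state = rk4_step(lorenz63, state, 0.01)
    trajectory.append(state)
trajectory = np.array(trajectory)
```

A learned three-dimensional autonomous model of this kind replaces the full Rayleigh-Bénard field with a handful of slowly varying coordinates.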
2 code implementations • 7 Nov 2019 • Shanshan Tang, Bo Li, Haijun Yu
In this paper, we propose a new and more stable way to construct deep RePU neural networks based on Chebyshev polynomial approximations.
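The construction starts from a Chebyshev expansion of the target function, which is then converted into a RePU network. A minimal sketch of the approximation stage, using NumPy's Chebyshev tools (the target function `np.exp` and the degree are illustrative choices, not from the paper):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def chebyshev_approx(f, degree, n_samples=200):
    """Least-squares Chebyshev fit of f on [-1, 1], sampled at
    Chebyshev-Lobatto nodes for numerical stability."""
    x = np.cos(np.pi * np.arange(n_samples) / (n_samples - 1))
    return C.chebfit(x, f(x), degree)

# Degree-10 Chebyshev approximation of exp on [-1, 1].
coeffs = chebyshev_approx(np.exp, degree=10)
grid = np.linspace(-1.0, 1.0, 1000)
err = np.max(np.abs(C.chebval(grid, coeffs) - np.exp(grid)))
```

The paper's contribution is the second, stability-preserving stage: realizing such an expansion exactly with RePU units, which the sketch above does not cover.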
no code implementations • 9 Sep 2019 • Bo Li, Shanshan Tang, Haijun Yu
In this paper, we construct deep neural networks with rectified power units (RePU), which can give better approximations for smooth functions.
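A rectified power unit is $\sigma_s(x) = \max(0, x)^s$; for $s = 2$ (ReQU) the identities $x^2 = \sigma_2(x) + \sigma_2(-x)$ and $xy = \frac{1}{4}\big((x+y)^2 - (x-y)^2\big)$ let a RePU network reproduce polynomials exactly, which is why smooth functions admit compact RePU approximations. A minimal sketch of these identities:

```python
import numpy as np

def repu(x, s=2):
    """Rectified power unit: max(0, x) ** s (s = 2 is ReQU)."""
    return np.maximum(0.0, x) ** s

def square_via_requ(x):
    """x**2 represented exactly by two ReQU units."""
    return repu(x, 2) + repu(-x, 2)

def product_via_requ(x, y):
    """x*y via the polarization identity, using four ReQU units."""
    return (square_via_requ(x + y) - square_via_requ(x - y)) / 4.0
```

Because squares and products are exact rather than approximated (as they must be with ReLU), polynomial approximants translate into RePU networks without additional error.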
no code implementations • 6 May 2019 • Weiwen Wu, Haijun Yu, Peijun Chen, Fulin Luo, Fenglin Liu, Qian Wang, Yining Zhu, Yanbo Zhang, Jian Feng, Hengyong Yu
Second, we employ the direct inversion (DI) method to obtain initial material decomposition results, and a set of image patches is extracted from the mode-1 unfolding of the normalized material image tensor to train a unified dictionary with the K-SVD technique.
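The patch-extraction step can be sketched as follows; the tensor shape, patch size, and stride below are illustrative assumptions, and note that column-ordering conventions for mode-1 unfolding differ across references (a plain row-major reshape is used here):

```python
import numpy as np

def mode1_unfold(T):
    """Mode-1 unfolding of an (I, J, K) tensor into an (I, J*K) matrix.
    Row-major reshape; other references permute the columns differently."""
    return T.reshape(T.shape[0], -1)

def extract_patches(M, patch=8, stride=8):
    """Collect vectorized patch x patch blocks from a matrix, as is
    typically done before dictionary training."""
    rows, cols = M.shape
    patches = []
    for i in range(0, rows - patch + 1, stride):
        for j in range(0, cols - patch + 1, stride):
            patches.append(M[i:i + patch, j:j + patch].ravel())
    return np.array(patches)

# Illustrative shapes only (not the paper's data): a 64 x 64 image
# with 3 material channels, unfolded and cut into 8 x 8 patches.
T = np.random.default_rng(0).random((64, 64, 3))
P = extract_patches(mode1_unfold(T))
```

Each row of `P` would then serve as a training sample for the K-SVD dictionary.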
no code implementations • 14 Mar 2019 • Bo Li, Shanshan Tang, Haijun Yu
Compared with the corresponding results for ReLU networks, the sizes of the RePU networks required to approximate functions in Sobolev and Korobov spaces to an error tolerance $\varepsilon$, by our constructive proofs, are in general $\mathcal{O}(\log \frac{1}{\varepsilon})$ times smaller than the sizes of the corresponding ReLU networks.
Numerical Analysis
no code implementations • 4 Aug 2018 • Shanshan Tang, Haijun Yu
While denoising is often considered unnecessary in many big-data applications, we show in this paper that it is helpful in urban traffic analysis, by applying bounded total variation denoising to the urban road traffic prediction and clustering problem.