no code implementations • 2 Jun 2023 • Yuesheng Xu, Haizhang Zhang
We consider deep neural networks with a Lipschitz continuous activation function and with weight matrices of variable widths.
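Such a network can be sketched in a few lines of NumPy; the specific widths, the tanh activation (which is Lipschitz with constant 1), and the random weights below are illustrative assumptions, not the paper's construction:

```python
import numpy as np

# Sketch (illustrative, not the paper's construction): a deep network
# whose weight matrices have variable widths, with a Lipschitz
# continuous activation (tanh is 1-Lipschitz).
rng = np.random.default_rng(0)
widths = [3, 5, 4, 2]  # layer widths vary with depth
weights = [rng.standard_normal((m, n)) for n, m in zip(widths, widths[1:])]
biases = [rng.standard_normal(m) for m in widths[1:]]

def forward(x):
    """Apply each variable-width affine map followed by the activation."""
    for W, b in zip(weights, biases):
        x = np.tanh(W @ x + b)  # Lipschitz activation applied entrywise
    return x

y = forward(np.ones(widths[0]))  # output has the final width, here 2
```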
no code implementations • 28 Nov 2022 • Jia-Qi Lin, Man-Sheng Chen, Xi-Ran Zhu, Chang-Dong Wang, Haizhang Zhang
Specifically, the proposed method introduces a Specific Information Reconstruction (SIR) module to disentangle the exploration of consensus information from that of view-specific information across multiple views, which enables the GCN to capture more essential low-level representations.
no code implementations • 13 May 2022 • Wentao Huang, Haizhang Zhang
By studying the two series, we establish a sufficient condition for pointwise convergence of ResNets.
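The idea that a ResNet's outputs form partial sums of a series can be illustrated numerically: if the residual blocks shrink fast enough, deepening the network barely changes the output. The geometric scaling below is an illustrative assumption chosen to make the series converge, not a condition from the paper:

```python
import numpy as np

# Sketch: a ResNet realized as partial sums x_{n+1} = x_n + g_n(x_n).
# The residual blocks here are scaled by 2^-n (illustrative), so the
# increments are summable and the outputs stabilize as depth grows.
rng = np.random.default_rng(1)
d = 4
Ws = [rng.standard_normal((d, d)) * 2.0**-n for n in range(30)]

def resnet(x, depth):
    for W in Ws[:depth]:
        x = x + np.maximum(W @ x, 0.0)  # ReLU residual block
    return x

x0 = np.ones(d)
deep, deeper = resnet(x0, 20), resnet(x0, 30)
gap = np.linalg.norm(deeper - deep)  # small: outputs are stabilizing
```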
no code implementations • 13 May 2022 • Wentao Huang, Yuesheng Xu, Haizhang Zhang
In the current work, we study the convergence of deep neural networks as the depth tends to infinity for two other important activation functions: the leaky ReLU and the sigmoid function.
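For reference, the two activation functions studied are easy to state; the 0.01 leak slope below is a common default, not a value taken from the paper:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: identity for x >= 0, small slope alpha for x < 0."""
    return np.where(x >= 0, x, alpha * x)

def sigmoid(x):
    """Logistic sigmoid, mapping the real line into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))
```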
no code implementations • 28 Sep 2021 • Yuesheng Xu, Haizhang Zhang
Based on the conditions, we present sufficient conditions for pointwise convergence of general deep ReLU networks with increasing widths, as well as pointwise convergence of deep ReLU convolutional neural networks.
no code implementations • 27 Jul 2021 • Yuesheng Xu, Haizhang Zhang
We explore convergence of deep neural networks with the popular ReLU activation function, as the depth of the networks tends to infinity.
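The phenomenon can be observed numerically: when the layer maps are contractive, repeatedly applying a ReLU layer drives the output toward a limit as the depth grows. The weight scaling below is an illustrative assumption chosen to make the layer map a contraction, not a condition from the paper:

```python
import numpy as np

# Sketch: fixed-width ReLU layers x -> ReLU(W x + b) applied repeatedly.
# The 0.3/sqrt(d) scaling (illustrative) keeps ||W|| < 1, so the layer
# map is a contraction and the outputs converge as depth -> infinity.
rng = np.random.default_rng(2)
d = 3
W = 0.3 * rng.standard_normal((d, d)) / np.sqrt(d)
b = rng.standard_normal(d)

def deep_relu(x, depth):
    for _ in range(depth):
        x = np.maximum(W @ x + b, 0.0)  # one ReLU layer
    return x

x0 = rng.standard_normal(d)
# Outputs at depths 50 and 60 are nearly identical: the network converges.
gap = np.linalg.norm(deep_relu(x0, 60) - deep_relu(x0, 50))
```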
no code implementations • 16 Jun 2021 • Yunfei Yang, Haizhang Zhang
Specifically, we show that one can recover a band-limited function by Gaussian or hyper-Gaussian regularized nonuniform sampling series with an exponential convergence rate.
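A minimal numerical sketch of such a Gaussian-regularized sampling series is below, using uniform integer samples for simplicity (the paper treats nonuniform sampling); the truncation level, Gaussian width, and test function are illustrative assumptions:

```python
import numpy as np

# Sketch: truncated sinc sampling series damped by a Gaussian factor.
# Uniform integer samples for simplicity; n and s are illustrative.
def gauss_sinc_recover(f, t, n=30, s=3.0):
    k = np.arange(-n, n + 1)  # integer sample points
    # np.sinc(x) = sin(pi x)/(pi x), the cardinal series kernel
    terms = f(k) * np.sinc(t - k) * np.exp(-((t - k) ** 2) / (2 * s**2))
    return terms.sum()

f = lambda x: np.sinc(0.5 * x)  # band-limited test function (band pi/2)
approx = gauss_sinc_recover(f, t=0.3)
err = abs(approx - f(0.3))      # small reconstruction error
```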
no code implementations • 1 Apr 2021 • Jie Gui, Haizhang Zhang
Multi-task learning is an important trend in machine learning for the era of artificial intelligence and big data.
no code implementations • 4 Jan 2019 • Rongrong Lin, Haizhang Zhang, Jun Zhang
We explore a generic definition of reproducing kernel Banach spaces (RKBS) and of the reproducing kernel for an RKBS that is independent of any particular construction.
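In illustrative notation (not necessarily the paper's), the reproducing property in this setting reads: with $\mathcal{B}$ a Banach space of functions on a set $X$ in which point evaluations are continuous, and $\langle\cdot,\cdot\rangle$ a suitable duality pairing,

```latex
f(x) = \langle f,\ K(x,\cdot) \rangle \qquad \text{for all } f \in \mathcal{B},\ x \in X .
```

In the Hilbert-space case this reduces to the familiar RKHS reproducing property with the inner product as the pairing.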
no code implementations • 21 Oct 2013 • Benxun Wang, Haizhang Zhang
The main purpose of this note is to give a detailed discussion of the universality of weighted polynomial kernels.
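For context, a standard notion of universality (assumed here; the note may work with a refined variant): a continuous kernel $K$ on a compact metric space $X$ is universal if

```latex
\overline{\operatorname{span}}\,\{\, K(\cdot, y) : y \in X \,\} = C(X),
```

that is, the span of the kernel sections is dense in the space of continuous functions on $X$ under the uniform norm.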