no code implementations • 13 May 2022 • Wentao Huang, Yuesheng Xu, Haizhang Zhang
In this work, we study the convergence of deep neural networks as the depth tends to infinity for two other important activation functions: the leaky ReLU and the sigmoid function.
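As a rough numerical sketch (not the paper's construction or its actual conditions), the snippet below stacks leaky-ReLU layers whose weights and biases perturb the identity by summable amounts — a decay rate assumed here purely for illustration — and prints how little the output changes as the depth grows.

```python
import numpy as np

# A rough numerical sketch, not the paper's construction: stack leaky-ReLU
# layers x_{k+1} = leaky_relu(W_k x_k + b_k) whose weights and biases perturb
# the identity by summable amounts (the 1/k^2 decay is assumed here purely for
# illustration) and watch the output change less and less as depth grows.

def leaky_relu(x, alpha=0.1):
    return np.where(x > 0, x, alpha * x)

rng = np.random.default_rng(0)
d = 5
x = rng.standard_normal(d)
prev = x.copy()
for k in range(1, 51):
    W = np.eye(d) + rng.standard_normal((d, d)) / k**2   # ||W_k - I|| summable
    b = rng.standard_normal(d) / k**2                    # ||b_k|| summable
    x = leaky_relu(W @ x + b)
    if k % 10 == 0:
        print(f"depth {k}: change over last 10 layers = {np.linalg.norm(x - prev):.3e}")
        prev = x.copy()
```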
no code implementations • 13 May 2022 • Wentao Huang, Haizhang Zhang
By studying the two series, we establish a sufficient condition for pointwise convergence of ResNets.
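The two series analyzed in the paper are not reproduced in this excerpt, so the toy example below simply builds a residual network whose block weights decay fast enough that the series of their norms converges (an assumption chosen here for demonstration) and shows the iterates settling as more blocks are appended.

```python
import numpy as np

# Toy illustration only; the two series analyzed in the paper are not
# reproduced here. Each residual block is x_{k+1} = x_k + V_k @ relu(W_k @ x_k)
# with block weights whose norms decay like 1/k^2, so the series of norms
# converges and the iterates settle as more blocks are appended.

rng = np.random.default_rng(1)
d = 4
x = rng.standard_normal(d)
for k in range(1, 201):
    W = rng.standard_normal((d, d)) / k**2   # summable block norms
    V = rng.standard_normal((d, d)) / k**2
    x = x + V @ np.maximum(W @ x, 0.0)       # one residual block
    if k in (10, 50, 100, 200):
        print(f"{k} blocks: x = {np.round(x, 4)}")
```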
no code implementations • 15 Sep 2021 • Meiyi Li, Wentao Huang, Nengling Tai, Dongliang Duan
As the high penetration of distributed generation reduces system inertia, virtual synchronous generator (VSG) technology has been proposed to improve the stability of inverter-interfaced distributed generators by providing "virtual inertia".
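For background only, VSG control is usually described as emulating the swing equation of a synchronous machine (a textbook relation, not necessarily the exact model used in this paper):

$$J\,\frac{d\omega}{dt} = T_m - T_e - D\,(\omega - \omega_0),$$

where $J$ is the virtual inertia, $D$ the damping coefficient, $T_m$ and $T_e$ the virtual mechanical and electrical torques, and $\omega_0$ the nominal angular frequency; a larger $J$ slows the frequency response and thereby mimics physical inertia.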
no code implementations • 24 Apr 2021 • Peng Xie, Wenyuan Tao, Jie Li, Wentao Huang, Siming Chen
The core of the approach is a subset embedding network (SEN) that represents a group of subsets as uniformly-formatted embeddings.
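The exact SEN architecture is not described in this excerpt; the sketch below is a generic permutation-invariant stand-in showing how subsets of different sizes can be mapped to uniformly-formatted, fixed-length embeddings via a shared per-item transform followed by pooling. All layer sizes and names are illustrative.

```python
import numpy as np

# Generic stand-in, not the paper's SEN architecture: map subsets of different
# sizes to fixed-length, uniformly formatted embeddings with a shared per-item
# transform followed by mean pooling, so the result is permutation-invariant.

rng = np.random.default_rng(0)
item_dim, hidden, emb_dim = 8, 32, 16
W1 = 0.1 * rng.standard_normal((item_dim, hidden))
W2 = 0.1 * rng.standard_normal((hidden, emb_dim))

def embed_subset(items: np.ndarray) -> np.ndarray:
    """items: (n_items, item_dim) -> (emb_dim,) fixed-size embedding."""
    h = np.maximum(items @ W1, 0.0)               # shared per-item MLP layer
    return np.maximum(h @ W2, 0.0).mean(axis=0)   # pool over the subset

subsets = [rng.standard_normal((n, item_dim)) for n in (3, 17, 42)]
embeddings = np.stack([embed_subset(s) for s in subsets])
print(embeddings.shape)   # (3, 16): every subset gets the same embedding format
```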
no code implementations • 1 Feb 2017 • Wentao Huang, Xin Huang, Kechen Zhang
We have developed an efficient information-maximization method for computing the optimal shapes of tuning curves of sensory neurons by optimizing the parameters of the underlying feedforward network model.
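As a hedged illustration of information maximization over tuning-curve parameters — a toy one-neuron model with Gaussian noise, not the paper's feedforward network or its algorithm — the script below computes I(S; R) numerically on a grid and tunes the sigmoid's threshold and width to maximize it. All parameter values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy model: one neuron with sigmoidal tuning curve
# f(s) = r_max / (1 + exp(-(s - theta) / w)) and Gaussian response noise.
# Mutual information I(S; R) is evaluated on a grid and maximized over
# (theta, w). All parameter values are illustrative.

s_grid = np.linspace(-3, 3, 201)                 # stimulus grid, Gaussian prior
p_s = np.exp(-s_grid**2 / 2)
p_s /= p_s.sum()
r_grid = np.linspace(-1, 6, 301)                 # response grid
sigma = 0.5                                      # assumed noise std

def tuning(s, theta, w, r_max=5.0):
    return r_max / (1.0 + np.exp(-(s - theta) / w))

def neg_mutual_info(params):
    theta, log_w = params
    f = tuning(s_grid, theta, np.exp(log_w))
    p_r_s = np.exp(-(r_grid[None, :] - f[:, None])**2 / (2 * sigma**2))
    p_r_s /= p_r_s.sum(axis=1, keepdims=True)    # p(r | s) on the grid
    p_r = p_s @ p_r_s                            # marginal p(r)
    mi = np.sum(p_s[:, None] * p_r_s * np.log((p_r_s + 1e-12) / (p_r[None, :] + 1e-12)))
    return -mi

res = minimize(neg_mutual_info, x0=[0.0, 0.0], method="Nelder-Mead")
print("theta, w:", res.x[0], np.exp(res.x[1]), "MI (nats):", -res.fun)
```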
no code implementations • 7 Nov 2016 • Wentao Huang, Kechen Zhang
Starting from the initial solution, an efficient algorithm based on gradient descent on the final objective function is proposed to learn representations from the input datasets, and the method works for complete, overcomplete, and undercomplete bases.
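The paper's objective function is not given in this excerpt, so the sketch below substitutes a generic reconstruction-plus-sparsity loss and runs plain gradient descent on both the codes and the basis; the basis size k can be set below, equal to, or above the data dimension d to mimic the undercomplete, complete, and overcomplete cases. All symbols and step sizes are assumptions.

```python
import numpy as np

# Illustrative sketch with a generic reconstruction-plus-sparsity loss standing
# in for the paper's objective. Plain gradient descent updates both the codes Z
# and the basis B; choose k < d, k = d, or k > d for the undercomplete,
# complete, or overcomplete case.

rng = np.random.default_rng(0)
n, d, k = 500, 16, 32                        # overcomplete example (k > d)
X = rng.standard_normal((n, d))
Z = 0.01 * rng.standard_normal((n, k))       # representations (codes)
B = rng.standard_normal((k, d))              # basis / dictionary rows
lam, lr = 0.1, 0.01

for step in range(3001):
    R = Z @ B - X                            # reconstruction residual
    grad_Z = R @ B.T + lam * np.tanh(5 * Z)  # smooth surrogate for an L1 penalty
    grad_B = Z.T @ R / n                     # averaged over samples
    Z -= lr * grad_Z
    B -= lr * grad_B
    B /= np.linalg.norm(B, axis=1, keepdims=True)   # keep basis rows unit norm
    if step % 1000 == 0:
        print(step, 0.5 * np.sum(R**2) / n)
```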
no code implementations • 4 Nov 2016 • Wentao Huang, Kechen Zhang
While Shannon's mutual information has widespread applications in many disciplines, in practice it is often difficult to calculate its value accurately for high-dimensional variables because of the curse of dimensionality.
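One simple way to see the curse of dimensionality at work is to count the cells of a plug-in (histogram) estimator of mutual information: the joint histogram over (X, Y) needs bins_per_axis ** (2 * dim) cells, so a fixed sample budget leaves almost every cell empty as the dimension grows and the estimate degrades. The toy script below (bin counts and sample sizes are illustrative) makes that explicit.

```python
import numpy as np

# Toy demonstration of the curse of dimensionality for a plug-in (histogram)
# estimator of mutual information: the joint histogram over (X, Y) needs
# bins_per_axis ** (2 * dim) cells, so a fixed sample budget leaves almost
# every cell empty once the dimension grows.

rng = np.random.default_rng(0)
bins_per_axis, n_samples = 10, 10_000
edges = np.linspace(-4, 4, bins_per_axis + 1)

for dim in (1, 2, 4, 8):
    X = rng.standard_normal((n_samples, dim))
    Y = X + 0.5 * rng.standard_normal((n_samples, dim))   # strongly dependent pair
    joint = np.hstack([X, Y])
    codes = np.stack([np.digitize(joint[:, j], edges) for j in range(2 * dim)], axis=1)
    occupied = len({tuple(c) for c in codes})
    print(f"dim={dim}: {bins_per_axis ** (2 * dim):.1e} joint cells, "
          f"{occupied} occupied by {n_samples} samples")
```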