Search Results for author: Wentao Huang

Found 7 papers, 0 papers with code

Convergence of Deep Neural Networks with General Activation Functions and Pooling

no code implementations · 13 May 2022 · Wentao Huang, Yuesheng Xu, Haizhang Zhang

In this work, we study the convergence of deep neural networks as the depth tends to infinity for two other important activation functions: the leaky ReLU and the sigmoid function.
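For reference, the two activation functions named in the abstract can be written in a few lines (a minimal illustration of the standard definitions, not code from the paper):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: identity for non-negative inputs, small slope alpha otherwise."""
    return np.where(x >= 0, x, alpha * x)

def sigmoid(x):
    """Logistic sigmoid: smooth and bounded in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))
```

Unlike the plain ReLU, both functions are nonzero on the negative half-line, which is what makes their depth-to-infinity convergence analysis a distinct question.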

Convergence Analysis of Deep Residual Networks

no code implementations · 13 May 2022 · Wentao Huang, Haizhang Zhang

By studying the two series, we establish a sufficient condition for pointwise convergence of ResNets.

Virtual Inertia Control of the Virtual Synchronous Generator: A Review

no code implementations · 15 Sep 2021 · Meiyi Li, Wentao Huang, Nengling Tai, Dongliang Duan

With the increasing impact of low inertia due to the high penetration of distributed generation, virtual synchronous generator (VSG) technology has been proposed to improve the stability of the inverter-interfaced distributed generator by providing "virtual inertia".
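The stabilizing role of virtual inertia can be sketched with a discretized swing equation, 2H · d(Δf)/dt = ΔP − D·Δf (my own illustration under standard per-unit assumptions, not the paper's model): a larger inertia constant H slows the frequency excursion after a power imbalance.

```python
def freq_deviation(H, D=1.0, delta_P=-0.1, dt=0.01, steps=500):
    """Frequency deviation after steps*dt seconds, forward-Euler swing equation."""
    f = 0.0  # per-unit frequency deviation, starts at nominal
    for _ in range(steps):
        f += dt * (delta_P - D * f) / (2.0 * H)
    return f

# With more (virtual) inertia, the deviation 5 s after the disturbance is smaller.
low_inertia = freq_deviation(H=2.0)
high_inertia = freq_deviation(H=10.0)
```

This is the intuition behind VSG control: the inverter emulates the H term that a synchronous machine's rotating mass would otherwise provide.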

Exploring Multi-dimensional Data via Subset Embedding

no code implementations · 24 Apr 2021 · Peng Xie, Wenyuan Tao, Jie Li, Wentao Huang, Siming Chen

The core of the approach is a subset embedding network (SEN) that represents a group of subsets as uniformly-formatted embeddings.

Information-theoretic interpretation of tuning curves for multiple motion directions

no code implementations · 1 Feb 2017 · Wentao Huang, Xin Huang, Kechen Zhang

We have developed an efficient information-maximization method for computing the optimal shapes of tuning curves of sensory neurons by optimizing the parameters of the underlying feedforward network model.

An Information-Theoretic Framework for Fast and Robust Unsupervised Learning via Neural Population Infomax

no code implementations · 7 Nov 2016 · Wentao Huang, Kechen Zhang

Starting from the initial solution, an efficient algorithm based on gradient descent of the final objective function is proposed to learn representations from the input datasets; the method works for complete, overcomplete, and undercomplete bases.

Representation Learning

Information-Theoretic Bounds and Approximations in Neural Population Coding

no code implementations · 4 Nov 2016 · Wentao Huang, Kechen Zhang

While Shannon's mutual information has widespread applications in many disciplines, for practical applications it is often difficult to calculate its value accurately for high-dimensional variables because of the curse of dimensionality.
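The curse of dimensionality mentioned here is easy to see with the plug-in estimator for discrete variables (a minimal sketch of the standard definition, not the paper's method): computing I(X;Y) exactly requires the full joint probability table, whose size grows exponentially with the number of dimensions.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a 2-D joint probability table."""
    px = joint.sum(axis=1, keepdims=True)   # marginal of X
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y
    nz = joint > 0                          # skip zero-probability cells
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Two perfectly correlated fair bits share exactly 1 bit of information.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
```

For a d-dimensional variable with K states per dimension the table has K**d entries, which is why bounds and approximations of the kind developed in this paper are needed in high dimensions.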

Dimensionality Reduction
