Search Results for author: Zhulin Liu

Found 4 papers, 0 papers with code

Siamese Labels Auxiliary Learning

no code implementations 27 Feb 2021 Wenrui Gan, Zhulin Liu, C. L. Philip Chen, Tong Zhang

In general, the main work of this paper includes: (1) proposing SiLa Learning, which improves the performance of common models without increasing test-time parameters; (2) comparing SiLa with DML and showing that SiLa can improve the generalization of the model; (3) applying SiLa to Dynamic Neural Networks, demonstrating that SiLa can be used with various types of network structures.

Auxiliary Learning
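
The snippet above does not spell out the SiLa mechanism, so the following is only a generic auxiliary-learning sketch that is consistent with the stated property of improving training without adding test-time parameters: an auxiliary head contributes an extra loss during training and is simply discarded at inference. The names (BackboneWithAuxHead, aux_weight) and the architecture are hypothetical, not taken from the paper.

import torch
import torch.nn as nn

# Hypothetical auxiliary-head sketch (NOT the paper's exact SiLa formulation):
# an auxiliary classifier adds a second loss during training and is dropped at
# inference, so no test-time parameters are added.
class BackboneWithAuxHead(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())
        self.main_head = nn.Linear(128, num_classes)  # kept at test time
        self.aux_head = nn.Linear(128, num_classes)   # training-only head

    def forward(self, x):
        feats = self.backbone(x)
        return self.main_head(feats), self.aux_head(feats)

def training_loss(model, x, y, aux_weight=0.5):
    # Total loss = main loss + weighted auxiliary loss.
    main_logits, aux_logits = model(x)
    ce = nn.functional.cross_entropy
    return ce(main_logits, y) + aux_weight * ce(aux_logits, y)

@torch.no_grad()
def predict(model, x):
    # Inference uses only the main head; the auxiliary head is ignored.
    return model(x)[0].argmax(dim=1)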

Reducing the Computational Complexity of Pseudoinverse for the Incremental Broad Learning System on Added Inputs

no code implementations 17 Oct 2019 Hufei Zhu, Zhulin Liu, C. L. Philip Chen, Yanyang Liang

Specifically, when q > k, the proposed algorithm computes only a k × k matrix inverse, instead of the q × q matrix inverse required by the existing algorithm.

Incremental Learning
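
As a rough illustration of why the k × k route is cheaper when q > k, here is a NumPy sketch (an assumption-laden illustration, not the paper's exact recursive formulas): q new input rows A_a are appended to an n × k matrix A with full column rank, and the stacked pseudoinverse is refreshed either through a single k × k solve or through a Woodbury-style update that inverts a q × q matrix; both agree with numpy.linalg.pinv.

import numpy as np

# Assumed setup: A has full column rank, so
# pinv([A; A_a]) = (A^T A + A_a^T A_a)^(-1) [A^T, A_a^T].
rng = np.random.default_rng(0)
n, k, q = 200, 20, 50                     # q > k, so the k x k route is cheaper
A = rng.standard_normal((n, k))
A_a = rng.standard_normal((q, k))
A_full = np.vstack([A, A_a])

gram_old = A.T @ A                        # k x k Gram matrix of the old inputs
gram_new = gram_old + A_a.T @ A_a         # updated Gram matrix, still only k x k

# Route 1: refresh the pseudoinverse with a k x k solve.
pinv_kxk = np.linalg.solve(gram_new, A_full.T)

# Route 2: Woodbury-style update that inverts a q x q matrix instead.
gram_old_inv = np.linalg.inv(gram_old)
S = np.linalg.inv(np.eye(q) + A_a @ gram_old_inv @ A_a.T)   # q x q inverse
gram_new_inv = gram_old_inv - gram_old_inv @ A_a.T @ S @ A_a @ gram_old_inv
pinv_qxq = gram_new_inv @ A_full.T

print(np.allclose(pinv_kxk, pinv_qxq))                 # expected: True
print(np.allclose(pinv_kxk, np.linalg.pinv(A_full)))   # expected: True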

Broad Learning System: An Effective and Efficient Incremental Learning System Without the Need for Deep Architecture

no code implementations IEEE Transactions on Neural Networks and Learning Systems 2017 C. L. Philip Chen, Zhulin Liu

The BLS is established in the form of a flat network, where the original inputs are transferred and placed as “mapped features” in feature nodes, and the structure is expanded in the wide sense through the “enhancement nodes.” The incremental learning algorithms are developed for fast remodeling in broad expansion, without a retraining process, whenever the network is deemed to need expansion.

Incremental Learning
Object Recognition
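
For orientation only, the following is a minimal NumPy sketch of the flat layout described above: random "mapped feature" nodes, nonlinear "enhancement" nodes stacked beside them, and a single ridge-regularized pseudoinverse readout. The random mappings, the tanh nonlinearity, and the node counts are illustrative assumptions rather than the authors' implementation, and the incremental expansion updates are omitted.

import numpy as np

# Illustrative flat Broad Learning System layout (not the authors' code).
def broad_learning_fit(X, Y, n_feature_nodes=40, n_enhance_nodes=60,
                       ridge=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    W_e = rng.standard_normal((X.shape[1], n_feature_nodes))        # feature mapping
    W_h = rng.standard_normal((n_feature_nodes, n_enhance_nodes))   # enhancement mapping

    Z = X @ W_e                        # "mapped features" in the feature nodes
    H = np.tanh(Z @ W_h)               # "enhancement nodes" add nonlinearity
    A = np.hstack([Z, H])              # flat, wide state matrix [Z | H]

    # Ridge-regularized pseudoinverse readout: W = (A^T A + ridge*I)^(-1) A^T Y
    W = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ Y)
    return W_e, W_h, W

def broad_learning_predict(params, X):
    W_e, W_h, W = params
    Z = X @ W_e
    A = np.hstack([Z, np.tanh(Z @ W_h)])
    return A @ W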

Approximation learning methods of Harmonic Mappings in relation to Hardy Spaces

no code implementations 24 May 2017 Zhulin Liu, C. L. Philip Chen

A new Hardy space approach to the Dirichlet-type problem, based on Tikhonov regularization and reproducing kernel Hilbert spaces, is discussed in this paper; it turns out to be a typical extremal problem posed on the upper half of the complex plane.

Relation
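
The abstract points to Tikhonov regularization in a reproducing kernel Hilbert space; purely as a generic reference point (not the paper's specific Hardy-space construction), the standard form of such a regularized extremal problem and its representer-theorem solution are

\min_{f \in \mathcal{H}_K} \sum_{i=1}^{n} |f(z_i) - g(z_i)|^2 + \lambda \|f\|_{\mathcal{H}_K}^2,
\qquad f_\lambda(z) = \sum_{j=1}^{n} \alpha_j K(z, z_j),
\qquad (\mathbf{K} + \lambda I)\,\boldsymbol{\alpha} = \mathbf{g},

where K is the reproducing kernel of \mathcal{H}_K, \mathbf{K} is the n × n matrix with entries K(z_i, z_j), and \mathbf{g} collects the boundary data g(z_i).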
