Search Results for author: Zhenquan Lin

Found 2 papers, 2 papers with code

Compact Model Training by Low-Rank Projection with Energy Transfer

1 code implementation • 12 Apr 2022 • Kailing Guo, Zhenquan Lin, Xiaofen Xing, Fang Liu, Xiangmin Xu

In this paper, we devise a new training method, low-rank projection with energy transfer (LRPET), that trains low-rank compressed networks from scratch and achieves competitive performance.

Low-rank compression
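The core of low-rank compression is replacing a weight matrix with a reduced-rank approximation; LRPET's abstract additionally mentions an "energy transfer" step. The sketch below illustrates the general idea under an assumption: the energy lost by truncating singular values is transferred back by rescaling the retained ones so the Frobenius norm is preserved. The function name and the exact rescaling rule are hypothetical, not the paper's algorithm.

```python
import numpy as np

def low_rank_project_energy_transfer(W, rank):
    """Project W onto a rank-`rank` matrix via truncated SVD, then
    rescale the retained singular values so the result keeps the full
    Frobenius energy of W. Illustrative sketch only; the actual LRPET
    update rule may differ."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    s_kept = s[:rank]
    # Energy transfer (assumed form): scale kept singular values so
    # ||W_low||_F == ||W||_F, compensating the truncated energy.
    scale = np.linalg.norm(s) / (np.linalg.norm(s_kept) + 1e-12)
    return (U[:, :rank] * (s_kept * scale)) @ Vt[:rank, :]

W = np.random.randn(64, 32)
W_low = low_rank_project_energy_transfer(W, rank=8)
```

Applied after each training step, such a projection keeps the network close to a low-rank manifold while training from scratch, so no separate post-hoc decomposition stage is needed.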

Weight Evolution: Improving Deep Neural Networks Training through Evolving Inferior Weight Values

1 code implementation • 9 Oct 2021 • Zhenquan Lin, Kailing Guo, Xiaofen Xing, Xiangmin Xu

Comprehensive experiments show that WE outperforms other reactivation methods and plug-in training methods on typical convolutional neural networks, especially lightweight networks.
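The title describes WE as evolving inferior weight values, and the abstract groups it with "reactivation methods". The sketch below shows a generic reactivation step under an assumption: inferior weights are the smallest-magnitude entries, and they are refreshed with small random values at the scale of the current weights. This is an illustration of the reactivation idea, not WE's actual evolution rule.

```python
import numpy as np

def reactivate_inferior_weights(W, fraction=0.1, rng=None):
    """Replace the `fraction` smallest-magnitude (inferior) entries of W
    with fresh values drawn at the scale of the existing weights.
    Hypothetical sketch; WE's actual update derives new values
    differently."""
    rng = np.random.default_rng() if rng is None else rng
    flat_mag = np.abs(W).ravel()
    k = max(1, int(fraction * flat_mag.size))
    inferior_idx = np.argpartition(flat_mag, k)[:k]  # k smallest magnitudes
    W_new = W.copy()
    # Reinitialize inferior weights at the standard deviation of W,
    # giving them a chance to contribute again during training.
    W_new.ravel()[inferior_idx] = rng.normal(0.0, np.std(W), size=k)
    return W_new
```

As a plug-in step, such a function would be called periodically during training on each layer's weight tensor, leaving the optimizer and architecture unchanged.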
