no code implementations • CVPR 2021 • Miao Yin, Yang Sui, Siyu Liao, Bo Yuan
Notably, on CIFAR-100, with 2.3X and 2.4X compression ratios, our models have 1.96% and 2.21% higher top-1 accuracy than the original ResNet-20 and ResNet-32, respectively.
no code implementations • CVPR 2021 • Miao Yin, Siyu Liao, Xiao-Yang Liu, Xiaodong Wang, Bo Yuan
Although various prior works have been proposed to reduce the RNN model sizes, executing RNN models in resource-restricted environments is still a very challenging problem.
no code implementations • 28 Mar 2021 • Xiao Zang, Yi Xie, Siyu Liao, Jie Chen, Bo Yuan
In this paper, we, for the first time, perform a systematic investigation of noise injection-based regularization for point cloud-domain DNNs.
no code implementations • 8 Feb 2021 • Siyu Liao, Chunhua Deng, Miao Yin, Bo Yuan
Recently, deep neural networks have been successfully applied in channel coding to improve decoding performance.
no code implementations • 9 May 2020 • Miao Yin, Siyu Liao, Xiao-Yang Liu, Xiaodong Wang, Bo Yuan
Recurrent Neural Networks (RNNs) have been widely used in sequence analysis and modeling.
no code implementations • 23 Apr 2020 • Chunhua Deng, Siyu Liao, Yi Xie, Keshab K. Parhi, Xuehai Qian, Bo Yuan
On the other hand, the recent structured matrix-based approach (i.e., CirCNN) is limited by the relatively complex arithmetic computation (i.e., FFT), less flexible compression ratio, and its inability to fully utilize input sparsity.
no code implementations • 11 Jan 2020 • Siyu Liao, Jie Chen, Yanzhi Wang, Qinru Qiu, Bo Yuan
Continuous representation of words is a standard component in deep learning-based NLP models.
no code implementations • 16 Dec 2019 • Huy Phan, Yi Xie, Siyu Liao, Jie Chen, Bo Yuan
In addition, CAG exhibits high transferability across different DNN classifier models in the black-box attack scenario by introducing random dropout in the process of generating perturbations.
no code implementations • 28 Feb 2019 • Siyu Liao, Zhe Li, Liang Zhao, Qinru Qiu, Yanzhi Wang, Bo Yuan
Deep neural networks (DNNs), especially deep convolutional neural networks (CNNs), have emerged as a powerful technique in various machine learning applications.
no code implementations • 18 Feb 2018 • Yanzhi Wang, Caiwen Ding, Zhe Li, Geng Yuan, Siyu Liao, Xiaolong Ma, Bo Yuan, Xuehai Qian, Jian Tang, Qinru Qiu, Xue Lin
Hardware accelerations of deep learning systems have been extensively investigated in industry and academia.
no code implementations • 29 Aug 2017 • Caiwen Ding, Siyu Liao, Yanzhi Wang, Zhe Li, Ning Liu, Youwei Zhuo, Chao Wang, Xuehai Qian, Yu Bai, Geng Yuan, Xiaolong Ma, Yi-Peng Zhang, Jian Tang, Qinru Qiu, Xue Lin, Bo Yuan
As the size of DNNs continues to grow, it is critical to improve the energy efficiency and performance while maintaining accuracy.
no code implementations • ICML 2017 • Liang Zhao, Siyu Liao, Yanzhi Wang, Zhe Li, Jian Tang, Victor Pan, Bo Yuan
Recently low displacement rank (LDR) matrices, or so-called structured matrices, have been proposed to compress large-scale neural networks.