Search Results for author: Ritchie Zhao

Found 8 papers, 3 papers with code

Precision Gating: Improving Neural Network Efficiency with Dynamic Dual-Precision Activations

1 code implementation • ICLR 2020 • Yichi Zhang, Ritchie Zhao, Weizhe Hua, Nayun Xu, G. Edward Suh, Zhiru Zhang

The proposed approach is applicable to a variety of DNN architectures and significantly reduces the computational cost of DNN execution with almost no accuracy loss.
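The dual-precision idea in the title can be sketched as follows: compute every output from the high-order bits of the activations first, then spend the low-order-bit update only on outputs whose cheap estimate looks important. This is a minimal numpy sketch, not the paper's implementation; the fixed threshold `delta`, the `quantize` helper, and the function names are illustrative assumptions (the paper learns its gating thresholds).

```python
import numpy as np

def quantize(x, bits):
    """Uniform fixed-point quantization to `bits` bits (a simplified
    stand-in for the paper's quantizer, assumed for illustration)."""
    scale = 2.0 ** (bits - 1)
    return np.clip(np.round(x * scale), -scale, scale - 1) / scale

def precision_gated_matmul(W, x, lo_bits=4, hi_bits=8, delta=0.5):
    """Dual-precision sketch: a cheap MSB-only pass over all outputs,
    then a high-precision update only where the gate fires."""
    x_lo = quantize(x, lo_bits)   # MSB-only activations (cheap pass)
    x_hi = quantize(x, hi_bits)   # higher-precision activations
    y_lo = W @ x_lo               # low-precision partial sums
    gate = y_lo > delta           # "important" outputs to refine
    # Gated outputs receive the LSB correction; the rest keep the cheap value.
    y = np.where(gate, y_lo + W @ (x_hi - x_lo), y_lo)
    return y, gate
```

With a ReLU downstream, outputs whose low-precision estimate is small are likely to be zeroed anyway, which is why skipping their refinement can cost almost no accuracy.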


OverQ: Opportunistic Outlier Quantization for Neural Network Accelerators

no code implementations • 13 Oct 2019 • Ritchie Zhao, Jordan Dotzel, Zhanqiu Hu, Preslav Ivanov, Christopher De Sa, Zhiru Zhang

Specialized hardware for handling activation outliers can enable low-precision neural networks, but at the cost of nontrivial area overhead.


Improving Neural Network Quantization without Retraining using Outlier Channel Splitting

3 code implementations • 28 Jan 2019 • Ritchie Zhao, Yuwei Hu, Jordan Dotzel, Christopher De Sa, Zhiru Zhang

The majority of existing literature focuses on training quantized DNNs, while this work examines the less-studied topic of quantizing a floating-point model without (re)training.

Tasks: Language Modelling • Neural Network Compression • +1
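The channel-splitting idea named in the title rests on a simple identity: halving an outlier weight channel and duplicating it (along with the matching input channel) shrinks the dynamic range the quantizer must cover while leaving the layer's output exactly unchanged, so no retraining is needed. Below is a minimal numpy sketch for a linear layer y = W @ x; the function names and the `expand_ratio` default are illustrative assumptions, not the paper's API.

```python
import numpy as np

def outlier_channel_split(W, expand_ratio=0.02):
    """Halve the columns holding the largest-magnitude weights and append
    duplicate copies of them (minimal sketch; `expand_ratio` controls how
    many channels are split)."""
    W = W.copy()
    n_split = max(1, int(round(W.shape[1] * expand_ratio)))
    # Columns (input channels) containing the largest absolute weights.
    idx = np.argsort(np.abs(W).max(axis=0))[-n_split:]
    W[:, idx] /= 2.0                            # each outlier weight is halved...
    W = np.concatenate([W, W[:, idx]], axis=1)  # ...and duplicated
    return W, idx

def split_input(x, idx):
    # Duplicate the matching input channels so W' @ x' == W @ x exactly.
    return np.concatenate([x, x[idx]])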

Building Efficient Deep Neural Networks with Unitary Group Convolutions

no code implementations • CVPR 2019 • Ritchie Zhao, Yuwei Hu, Jordan Dotzel, Christopher De Sa, Zhiru Zhang

UGConvs generalize two disparate ideas in CNN architecture, channel shuffling (i.e., ShuffleNet) and block-circulant networks (i.e., CirCNN), and provide unifying insights that lead to a deeper understanding of each technique.

Binarized Convolutional Neural Networks with Separable Filters for Efficient Hardware Acceleration

no code implementations • 15 Jul 2017 • Jeng-Hau Lin, Tianwei Xing, Ritchie Zhao, Zhiru Zhang, Mani Srivastava, Zhuowen Tu, Rajesh K. Gupta

State-of-the-art convolutional neural networks are enormously costly in both compute and memory, demanding massively parallel GPUs for execution.
