Search Results for author: Hayden K. -H. So

Found 5 papers, 3 papers with code

Mix and Match: A Novel FPGA-Centric Deep Neural Network Quantization Framework

no code implementations • 8 Dec 2020 • Sung-En Chang, Yanyu Li, Mengshu Sun, Runbin Shi, Hayden K. -H. So, Xuehai Qian, Yanzhi Wang, Xue Lin

Unlike existing methods that use the same quantization scheme for all weights, we propose the first solution that applies different quantization schemes for different rows of the weight matrix.

Edge-computing • Model Compression • +1
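To make the row-wise mixed-scheme idea from the abstract above concrete, here is a minimal NumPy sketch that picks between a fixed-point and a power-of-two quantizer for each row of a weight matrix. The function names, bit widths, and the reconstruction-error selection rule are illustrative assumptions, not the framework's actual algorithm.

```python
import numpy as np

def quant_fixed_point(row, bits=4):
    """Uniform (fixed-point) quantization of one weight row."""
    scale = np.max(np.abs(row)) / (2 ** (bits - 1) - 1) + 1e-12
    return np.round(row / scale) * scale

def quant_power_of_two(row, bits=4):
    """Power-of-two quantization: each weight snaps to sign * 2^k."""
    sign = np.sign(row)
    exp = np.clip(np.round(np.log2(np.abs(row) + 1e-12)), -2 ** (bits - 1), 0)
    return sign * (2.0 ** exp)

def mixed_scheme_quantize(W, bits=4):
    """Quantize each row with whichever scheme gives lower MSE
    (an illustrative selection rule, not the paper's)."""
    out = np.empty_like(W)
    for i, row in enumerate(W):
        fp = quant_fixed_point(row, bits)
        p2 = quant_power_of_two(row, bits)
        out[i] = fp if np.mean((row - fp) ** 2) <= np.mean((row - p2) ** 2) else p2
    return out

W = np.random.randn(8, 16).astype(np.float32)
Wq = mixed_scheme_quantize(W)
```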

NITI: Training Integer Neural Networks Using Integer-only Arithmetic

1 code implementation • 28 Sep 2020 • Maolin Wang, Seyedramin Rasoulinezhad, Philip H. W. Leong, Hayden K. -H. So

While integer arithmetic has been widely adopted for improved performance in deep quantized neural network inference, training remains a task primarily executed using floating point arithmetic.
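To illustrate integer-only training in the spirit described above, the following NumPy sketch runs an int8 linear forward pass with int32 accumulation and applies a weight update scaled by a power-of-two right shift. The shift amounts, function names, and the stand-in gradient are assumptions for illustration and do not reproduce NITI's actual training scheme.

```python
import numpy as np

def int_linear_forward(x_int8, w_int8, shift=7):
    """Integer-only forward: int8 x int8 -> int32 accumulate,
    then rescale back to int8 by an arithmetic right shift."""
    acc = x_int8.astype(np.int32) @ w_int8.astype(np.int32).T
    return np.clip(acc >> shift, -128, 127).astype(np.int8)

def int_sgd_step(w_int8, grad_int32, lr_shift=10):
    """Toy integer weight update: a power-of-two 'learning rate' is applied
    via right shift, and the update is done entirely in integer arithmetic."""
    update = grad_int32 >> lr_shift
    return np.clip(w_int8.astype(np.int32) - update, -128, 127).astype(np.int8)

x = np.random.randint(-128, 128, size=(4, 16), dtype=np.int8)
w = np.random.randint(-128, 128, size=(8, 16), dtype=np.int8)
y = int_linear_forward(x, w)
g = np.random.randint(-2 ** 15, 2 ** 15, size=w.shape, dtype=np.int32)  # stand-in gradient
w = int_sgd_step(w, g)
```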

Dynamic Sparse Training: Find Efficient Sparse Network From Scratch With Trainable Masked Layers

1 code implementation • ICLR 2020 • Junjie Liu, Zhe Xu, Runbin Shi, Ray C. C. Cheung, Hayden K. -H. So

We present a novel network pruning algorithm called Dynamic Sparse Training that can jointly find the optimal network parameters and sparse network structure in a unified optimization process with trainable pruning thresholds.

Network Pruning
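A minimal PyTorch sketch of a masked linear layer with trainable per-row pruning thresholds and a straight-through mask gradient, in the spirit of the trainable masked layers described above; the class names and the simplified backward pass are illustrative assumptions rather than the paper's exact formulation.

```python
import torch
import torch.nn as nn

class BinaryStep(torch.autograd.Function):
    """Hard step for the mask with a straight-through gradient."""
    @staticmethod
    def forward(ctx, x):
        return (x > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output  # simplified straight-through estimator

class MaskedLinear(nn.Module):
    """Linear layer whose weights are masked by comparing |w| against a
    trainable threshold, so sparsity is learned jointly with the weights."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.threshold = nn.Parameter(torch.zeros(out_features, 1))  # per-row threshold

    def forward(self, x):
        mask = BinaryStep.apply(self.weight.abs() - self.threshold)
        return nn.functional.linear(x, self.weight * mask, self.bias)

layer = MaskedLinear(16, 8)
y = layer(torch.randn(4, 16))
```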

High-dimensional Dense Residual Convolutional Neural Network for Light Field Reconstruction

1 code implementation • 3 Oct 2019 • Nan Meng, Hayden K. -H. So, Xing Sun, Edmund Y. Lam

We consider the problem of high-dimensional light field reconstruction and develop a learning-based framework for spatial and angular super-resolution.

Super-Resolution • Vocal Bursts Intensity Prediction
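As a rough illustration of the dense residual design, the PyTorch sketch below builds one densely connected residual block on 2D feature maps, with the light-field views assumed to be folded into the batch dimension; the channel counts and names are illustrative, and the paper's actual network operates on higher-dimensional light-field data.

```python
import torch
import torch.nn as nn

class DenseResidualBlock(nn.Module):
    """Toy dense residual block: each conv sees the concatenation of all
    earlier feature maps, and the fused output is added back to the input."""
    def __init__(self, channels=32, growth=16, layers=3):
        super().__init__()
        self.convs = nn.ModuleList([
            nn.Conv2d(channels + i * growth, growth, kernel_size=3, padding=1)
            for i in range(layers)
        ])
        self.fuse = nn.Conv2d(channels + layers * growth, channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(torch.relu(conv(torch.cat(feats, dim=1))))
        return x + self.fuse(torch.cat(feats, dim=1))

# Light-field views folded into the batch dimension for this 2D sketch.
block = DenseResidualBlock()
y = block(torch.randn(2, 32, 64, 64))
```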

Consistency Analysis for the Doubly Stochastic Dirichlet Process

no code implementations • 24 May 2016 • Xing Sun, Nelson H. C. Yung, Edmund Y. Lam, Hayden K. -H. So

This technical report proves components consistency for the Doubly Stochastic Dirichlet Process with exponential convergence of posterior probability.
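As a rough guide to what such a consistency result looks like, the LaTeX sketch below states a generic form of posterior concentration on the true number of components at an exponential rate; the notation (Π, K_0, constants C and c) is assumed for illustration and is not the report's exact theorem.

```latex
% Illustrative form only -- not the report's exact statement.
% Posterior concentration on the true number of components K_0
% at an exponential rate in the sample size n:
\[
  \Pi\!\left( K = K_0 \mid X_1, \dots, X_n \right) \;\ge\; 1 - C e^{-c n}
  \qquad \text{for some constants } C, c > 0,
\]
% equivalently, the posterior mass on the wrong number of components
% vanishes exponentially fast:
\[
  \Pi\!\left( K \neq K_0 \mid X_1, \dots, X_n \right) \;\le\; C e^{-c n}
  \;\longrightarrow\; 0 \quad \text{as } n \to \infty .
\]
```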
