Search Results for author: Zhengyun Ji

Found 3 papers, 1 paper with code

The Synthesis of XNOR Recurrent Neural Networks with Stochastic Logic

no code implementations NeurIPS 2019 Arash Ardakani, Zhengyun Ji, Amir Ardakani, Warren Gross

XNOR networks have emerged to reduce the model size and computational cost of neural networks for deployment on specialized hardware that requires real-time processing with limited hardware resources.

Quantization
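
The excerpt above motivates XNOR networks by their low hardware cost. As a rough illustration of the underlying idea (not the paper's stochastic-logic synthesis), the sketch below shows why: with weights and activations constrained to {-1, +1}, a dot product reduces to a bitwise XNOR followed by a popcount. The packing convention and the helper name `xnor_dot` are illustrative assumptions.

```python
# Minimal sketch of the XNOR-network trick: a {-1, +1} dot product
# computed with XNOR + popcount instead of multiply-accumulates.
def xnor_dot(w_bits: int, x_bits: int, n: int) -> int:
    """Dot product of two length-n {-1, +1} vectors packed as bitmasks (1 -> +1, 0 -> -1)."""
    matches = ~(w_bits ^ x_bits) & ((1 << n) - 1)  # XNOR: 1 wherever the signs agree
    agree = bin(matches).count("1")                # popcount of agreements
    return 2 * agree - n                           # agreements minus disagreements

# Example: w = [+1, -1, +1, +1], x = [+1, +1, -1, +1] (LSB = element 0)
w_bits, x_bits, n = 0b1101, 0b1011, 4
ref = sum(a * b for a, b in zip([1, -1, 1, 1], [1, 1, -1, 1]))
assert xnor_dot(w_bits, x_bits, n) == ref  # both equal 0
```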

Learning to Skip Ineffectual Recurrent Computations in LSTMs

no code implementations9 Nov 2018 Arash Ardakani, Zhengyun Ji, Warren J. Gross

This observation suggests that a large fraction of the recurrent computations are ineffectual and can be skipped to speed up inference, since they involve non-contributory multiplications/accumulations with zero-valued states.
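
The excerpt describes the core observation rather than the paper's hardware mechanism. A minimal sketch of that observation, assuming a plain NumPy recurrent matrix-vector product: any multiply-accumulate paired with a zero-valued state contributes nothing and can be dropped without changing the result. The function name and the 50% zeroing below are illustrative assumptions.

```python
import numpy as np

def recurrent_matvec_skip_zeros(U: np.ndarray, h: np.ndarray) -> np.ndarray:
    """Compute U @ h while skipping multiply-accumulates for zero-valued states."""
    out = np.zeros(U.shape[0])
    active = np.nonzero(h)[0]      # indices of states that actually contribute
    for j in active:               # columns paired with h[j] == 0 are never touched
        out += U[:, j] * h[j]
    return out

rng = np.random.default_rng(0)
U = rng.standard_normal((8, 8))
h = rng.standard_normal(8)
h[rng.random(8) < 0.5] = 0.0       # many zero-valued states -> fewer MACs needed
assert np.allclose(recurrent_matvec_skip_zeros(U, h), U @ h)
```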

Learning Recurrent Binary/Ternary Weights

1 code implementation ICLR 2019 Arash Ardakani, Zhengyun Ji, Sean C. Smithson, Brett H. Meyer, Warren J. Gross

On the software side, we evaluate the performance (in terms of accuracy) of our method using long short-term memories (LSTMs) on various sequential tasks, including sequence classification and language modeling.

Language Modelling
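
The excerpt covers the evaluation; the title refers to recurrent weights constrained to binary or ternary values. As a generic illustration of what a ternary weight matrix looks like (not the paper's training procedure), the sketch below applies a magnitude-threshold ternarizer to a toy recurrent weight matrix; the 0.7 threshold scale is a common heuristic from the ternary-weight literature, assumed here for illustration.

```python
import numpy as np

def ternarize(W: np.ndarray, delta_scale: float = 0.7) -> np.ndarray:
    """Map real-valued weights to {-1, 0, +1} using a per-matrix magnitude threshold."""
    delta = delta_scale * np.mean(np.abs(W))  # threshold below which weights become 0
    W_t = np.zeros_like(W)
    W_t[W > delta] = 1.0
    W_t[W < -delta] = -1.0
    return W_t

rng = np.random.default_rng(0)
W_hh = rng.standard_normal((4, 4))  # toy recurrent weight matrix
print(ternarize(W_hh))              # entries are only -1, 0, or +1
```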
