Exponential discretization of weights of neural network connections in pre-trained neural networks

3 Feb 2020 · Magomed Yu. Malsagov, Emil M. Khayrov, Maria M. Pushkareva, Iakov M. Karandashev

To reduce random access memory (RAM) requirements and to increase the speed of recognition algorithms, we consider the weight discretization problem for trained neural networks. We show that exponential discretization is preferable to linear discretization, since it allows one to achieve the same accuracy with 1 or 2 fewer bits...
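The contrast between the two schemes can be illustrated with a minimal sketch. The helper names and level-allocation details below are assumptions for illustration, not the authors' exact procedure: linear discretization places levels on a uniform grid, while exponential discretization places them at powers of two, matching the roughly log-normal spread of trained weights.

```python
import numpy as np

def linear_quantize(w, bits):
    """Uniformly quantize weights to 2**bits levels over [-max|w|, max|w|]."""
    levels = 2 ** bits
    step = 2 * np.max(np.abs(w)) / (levels - 1)
    return np.round(w / step) * step

def exp_quantize(w, bits):
    """Quantize |w| to powers of two, reserving one bit for the sign.

    Level allocation (2**(bits-1) exponent levels below the maximum)
    is an illustrative assumption, not the paper's exact scheme.
    """
    sign = np.sign(w)
    mag = np.where(w == 0, np.finfo(w.dtype).tiny, np.abs(w))
    exp = np.round(np.log2(mag))
    e_max = np.max(exp)
    exp = np.clip(exp, e_max - (2 ** (bits - 1) - 1), e_max)
    return sign * 2.0 ** exp

# Compare reconstruction error on synthetic (roughly Gaussian) weights.
w = np.random.randn(1000) * 0.1
print("linear MSE:", np.mean((w - linear_quantize(w, 4)) ** 2))
print("exp    MSE:", np.mean((w - exp_quantize(w, 4)) ** 2))
```

On real trained weights, whose magnitudes span several orders of magnitude, the exponential grid spends its levels where the weights actually are, which is the intuition behind the paper's bit savings.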

