Computing threshold functions using dendrites

10 Nov 2016  ·  Romain Cazé, Bartosz Teleńczuk, Alain Destexhe

Neurons, modeled as linear threshold units (LTUs), can in theory compute all threshold functions. In practice, however, some of these functions require synaptic weights of arbitrarily large precision. We show here that dendrites can alleviate this requirement. We introduce the non-Linear Threshold Unit (nLTU), which integrates synaptic input sub-linearly within distinct subunits to take into account local saturation in dendrites. We systematically search the parameter space of the nLTU and the LTU to compare them. Firstly, we show that the nLTU can compute all threshold functions with lower-precision weights than the LTU. Secondly, we show that an nLTU can compute significantly more functions than an LTU when an input can make only a single synapse. This work paves the way for a new generation of networks made of nLTUs with binary synapses.
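
The abstract contrasts a classical LTU with a subunit-based nLTU in which each dendritic subunit integrates its synapses and saturates before contributing to the somatic sum. The sketch below is only an illustration of that idea, not the paper's exact formulation: the function names, the choice of a simple `min` clamp as the saturating non-linearity, and the example weights, subunit partition, saturation level, and threshold are all assumptions made for the example.

```python
import numpy as np

def ltu(x, w, theta):
    """Linear threshold unit: fires iff the weighted input sum reaches theta."""
    return int(np.dot(w, x) >= theta)

def nltu(x, w, subunits, saturation, theta):
    """Non-linear threshold unit (sketch): inputs are summed within each
    dendritic subunit, clamped to a saturation level (sub-linear integration),
    and the saturated subunit outputs are summed and compared to theta."""
    total = 0.0
    for idx in subunits:                    # idx: indices of synapses on this subunit
        local = np.dot(w[idx], x[idx])      # local synaptic integration
        total += min(local, saturation)     # local dendritic saturation
    return int(total >= theta)

# Illustrative (hypothetical) parameters: with unit weights, saturating each
# subunit changes the input-output mapping relative to the purely linear sum.
x = np.array([1, 1, 0, 0])
w = np.ones(4)
print(ltu(x, w, theta=2))                                             # -> 1
print(nltu(x, w, subunits=[[0, 1], [2, 3]], saturation=1, theta=2))   # -> 0
```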
