Flexible Transmitter Network

8 Apr 2020 · Shao-Qun Zhang, Zhi-Hua Zhou

Current neural networks are mostly built upon the MP (McCulloch-Pitts) model, which formulates a neuron as applying an activation function to the real-valued weighted aggregation of signals received from other neurons. In this paper, we propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity. The FT model employs a pair of parameters to model the transmitters between neurons and introduces a neuron-exclusive variable to record the regulated neurotrophin density, which leads to the formulation of the FT model as a two-variable two-valued function, taking the commonly-used MP neuron model as its special case. This formulation makes the FT model not only more biologically realistic, but also capable of handling complicated data such as time series. To exhibit its power and potential, we present the Flexible Transmitter Network (FTNet), which is built on the common fully-connected feed-forward architecture with the FT model as its basic building block. FTNet admits gradient calculation and can be trained by an improved back-propagation algorithm in the complex-valued domain. Experiments on a broad range of tasks show the superiority of the proposed FTNet. This study provides an alternative basic building block for neural networks and exhibits the feasibility of developing artificial neural networks with neuronal plasticity.
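The abstract does not spell out the FT neuron's exact update rule, but it does state that the model is a two-variable two-valued function realized in the complex-valued domain. The sketch below is one minimal reading of that description, assuming the pair of transmitter parameters (here `W`, `V`) combines the input with the neuron-exclusive memory as the real and imaginary parts of a complex pre-activation, and a complex activation splits back into an output and an updated memory. All names (`ft_neuron`, `W`, `V`, `s_prev`) are illustrative, not from the paper.

```python
import numpy as np

def ft_neuron(x_t, s_prev, W, V):
    """One step of a hypothetical FT-style neuron layer.

    Assumption: the external input x_t and the neuron-exclusive memory
    s_prev are aggregated into a single complex value; a complex
    activation then yields the output (real part) and the updated
    neurotrophin-density memory (imaginary part).
    """
    u = W @ x_t + 1j * (V @ s_prev)  # complex-valued aggregation
    z = np.tanh(u)                   # elementwise complex activation
    a_t = z.real                     # transmitted output
    s_t = z.imag                     # updated memory variable
    return a_t, s_t

# Toy usage: run the layer over a short random time series.
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((8, 4))
V = 0.1 * rng.standard_normal((8, 8))
s = np.zeros(8)
for x in rng.standard_normal((20, 4)):
    a, s = ft_neuron(x, s, W, V)
```

Because the pre-activation is a single complex number per unit, gradients with respect to `W` and `V` can be obtained with standard complex-valued back-propagation machinery, which is consistent with the training procedure the abstract describes; the specific activation and parameterization here are placeholders.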
