Learning Bayes-optimal dendritic opinion pooling

27 Apr 2021  ·  Jakob Jordan, João Sacramento, Willem A. M. Wybo, Mihai A. Petrovici, Walter Senn ·

In functional network models, neurons are commonly conceptualized as linearly summing presynaptic inputs before applying a non-linear gain function to produce output activity. In contrast, synaptic coupling between neurons in the central nervous system is regulated by dynamic permeabilities of ion channels. So far, the computational role of these membrane conductances remains unclear, and they are often considered an artifact of the biological substrate. Here we demonstrate that conductance-based synaptic coupling allows neurons to represent, process and learn uncertainties. We suggest that membrane potentials and conductances on dendritic branches code opinions with associated reliabilities. The biophysics of the membrane combines these opinions by taking their reliabilities into account, and the soma thus acts as a decision maker. We derive a gradient-based plasticity rule, allowing neurons to learn desired target distributions and to weight synaptic inputs by their relative reliabilities. Our theory explains various experimental findings at the system and single-cell level related to multi-sensory integration, and makes testable predictions on dendritic integration and synaptic plasticity.

PDF Abstract
No code implementations yet.
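Since no reference implementation is available, the following is a minimal, hypothetical sketch (in Python/NumPy, not the authors' code) of the general idea described in the abstract: the steady-state somatic potential is a conductance-weighted average of dendritic "opinions", analogous to precision-weighted (inverse-variance) Bayesian cue combination. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

def pool_opinions(E, g, g_leak=1.0, E_leak=0.0):
    """Illustrative conductance-weighted pooling of dendritic opinions.

    Each branch i contributes a target potential E[i] (its opinion) with a
    synaptic conductance g[i] (its reliability). The pooled somatic potential
    is the conductance-weighted mean, and the total conductance plays the
    role of the pooled reliability (precision).
    """
    E = np.asarray(E, dtype=float)
    g = np.asarray(g, dtype=float)
    g_total = g_leak + g.sum()                              # pooled reliability
    u_soma = (g_leak * E_leak + (g * E).sum()) / g_total    # reliability-weighted mean
    return u_soma, g_total

# Example: two branches with conflicting opinions; the branch with the
# larger conductance (higher reliability) dominates the somatic estimate.
u, g_tot = pool_opinions(E=[10.0, -5.0], g=[4.0, 1.0])
print(u, g_tot)  # pooled potential lies closer to 10, reliability is the summed conductance
```

This mirrors standard cue-combination arithmetic: with conductances interpreted as precisions, the weighted mean above coincides with the Bayes-optimal combination of Gaussian opinions.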
