Bayesian Confidence Propagation Neural Network (BCPNN) learning rule module in NEST v2.2.2
- KTH Royal Institute of Technology
Description
Bayesian Confidence Propagation Neural Network (BCPNN) is a Hebbian learning rule for spiking neurons inspired by Bayesian statistics. In this model, synaptic weights and intrinsic currents are adapted on-line upon the arrival of single spikes, which initiate a cascade of temporally interacting memory traces that locally estimate the probabilities associated with relative neuronal activation levels. The trace dynamics enable synaptic learning that exhibits a spike-timing dependence, returns stably to a set-point over long time scales, and remains competitive despite this stability. Beyond unsupervised learning, linking the traces with an external plasticity-modulating signal enables spike-based reinforcement learning. At the postsynaptic neuron, the traces are represented by an activity-dependent ion channel that regulates the input received by the postsynaptic cell and generates intrinsic graded persistent firing levels. The model provides a biophysical realization of Bayesian computation by reconciling several observed neural phenomena whose functional effects are only partially understood in concert.
Tully PJ, Hennig MH, Lansner A (2014) Synaptic and nonsynaptic plasticity approximating probabilistic inference. Frontiers in Synaptic Neuroscience 6:8.
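To make the trace cascade described above concrete, the following is a minimal NumPy sketch of the three stages of memory traces (fast Z traces, eligibility E traces, and slow probability P traces) and the resulting log-odds weight and intrinsic bias, in the spirit of Tully et al. (2014). All parameter values, the rate normalization, the epsilon floor, and the function name are illustrative assumptions for this sketch, not the actual interface or defaults of the NEST module.

```python
# Minimal sketch of BCPNN trace dynamics (Z -> E -> P) and the resulting
# log-odds weight/bias, integrated on a fixed time grid for illustration.
# Parameter values and normalization are assumptions, not module defaults.
import numpy as np

dt    = 1.0     # integration step (ms)
tau_z = 10.0    # fast (spike-filtering) trace time constant (ms)
tau_e = 100.0   # eligibility trace time constant (ms)
tau_p = 1000.0  # slow probability-estimate time constant (ms)
f_max = 50.0    # assumed maximum firing rate (Hz) used for normalization
eps   = 0.01    # small floor keeping probability estimates positive
kappa = 1.0     # external plasticity-modulating ("print-now") signal

def bcpnn_traces(pre_spikes, post_spikes):
    """Integrate BCPNN traces for binary spike trains (one entry per dt)."""
    n = len(pre_spikes)
    zi = zj = ei = ej = eij = 0.0
    pi = pj = pij = eps                      # assumed initial estimates
    w, bias = np.zeros(n), np.zeros(n)
    for t in range(n):
        # Z traces: low-pass filtered, rate-normalized spike trains
        zi += dt * (pre_spikes[t]  / (f_max * dt * 1e-3) - zi) / tau_z
        zj += dt * (post_spikes[t] / (f_max * dt * 1e-3) - zj) / tau_z
        # E traces: eligibility traces of the Z traces and their coincidence
        ei  += dt * (zi - ei)       / tau_e
        ej  += dt * (zj - ej)       / tau_e
        eij += dt * (zi * zj - eij) / tau_e
        # P traces: slow probability estimates, gated by kappa
        pi  += dt * kappa * (ei  - pi)  / tau_p
        pj  += dt * kappa * (ej  - pj)  / tau_p
        pij += dt * kappa * (eij - pij) / tau_p
        # Bayesian weight (log odds of co-activation) and intrinsic bias
        w[t]    = np.log((pij + eps**2) / ((pi + eps) * (pj + eps)))
        bias[t] = np.log(pj + eps)
    return w, bias

# Example: post fires ~5 ms after pre, so the weight drifts positive
rng  = np.random.default_rng(0)
pre  = (rng.random(5000) < 20.0 * dt * 1e-3).astype(float)  # ~20 Hz
post = np.roll(pre, 5)
w, bias = bcpnn_traces(pre, post)
print(f"final weight {w[-1]:.3f}, final bias {bias[-1]:.3f}")
```

In the actual module the corresponding quantities are updated on-line upon the arrival of single spikes rather than on a fixed time grid as in this sketch.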
Additional details
Related works
- Is derived from
- Publication: 10.5281/zenodo.5101626 (DOI)
- Is supplemented by
- Journal article: 10.3389/fnsyn.2014.00008 (DOI)