Abstract
Predictive coding has been identified as a key aspect of computation and learning in cortical microcircuits, but it is not known how synaptic plasticity installs and maintains predictive coding capabilities in these neural circuits. Predictions are inherently uncertain, and learning rules that aim at discriminating linearly separable classes of inputs – such as the perceptron learning rule – do not perform well when the goal is learning to predict. We show that experimental data on synaptic plasticity in apical dendrites of pyramidal cells support a different learning rule that is suitable for learning to predict. More precisely, it enables a spike-based approximation to logistic regression, a well-known gold standard for probabilistic prediction. We also show that data-based interactions between apical dendrites support learning of predictions for more complex probability distributions than those that can be handled by single dendrites. The resulting learning theory for top-down inputs to pyramidal cells provides a normative framework for evaluating experimental data, and it suggests further experiments for tracking the emergence of predictive coding through synaptic plasticity in apical dendrites.
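To make the benchmark concrete: the logistic regression referenced above can be trained by a simple error-driven weight update. The sketch below is illustrative only – it shows the standard rate-based gradient rule for logistic regression, not the paper's spike-based approximation – and the data, learning rate, and iteration count are arbitrary choices for the demonstration.

```python
import numpy as np

def sigmoid(z):
    """Logistic function: maps a linear drive to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic data: binary labels drawn from a ground-truth logistic model.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # two input features per sample
true_w, true_b = np.array([2.0, -1.0]), 0.5
y = (rng.random(200) < sigmoid(X @ true_w + true_b)).astype(float)

# Gradient ascent on the log-likelihood: the update is driven by the
# prediction error (observed label minus predicted probability).
w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(500):
    err = y - sigmoid(X @ w + b)               # per-sample prediction error
    w += lr * (X.T @ err) / len(y)
    b += lr * err.mean()
```

After training, `w` and `b` approximate the ground-truth parameters, and `sigmoid(X @ w + b)` yields calibrated probability estimates rather than hard class decisions – the property that distinguishes this rule from perceptron-style discriminative learning.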
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
* First authors
The results concerning the prediction of the moving object have been clarified to show how the target distributions are accurately predicted, and Figure 3 has been revised accordingly. The Methods sections have been updated to include missing parameters and to specify the precise implementation of the learning rule using input spikes. The title has been changed.