Abstract
Studies at the intersection of neuroscience and machine learning have offered new insights to explain hierarchical learning in the neocortex. Two competing hypotheses have emerged: deep learning-inspired approximations of the backpropagation algorithm, where neurons adjust synapses to minimize the error, and target learning algorithms, where neurons learn by reducing the feedback needed to achieve a desired activity. Despite decades of research and theoretical arguments supporting either possibility, there is currently no conclusive evidence for either hypothesis. We address this long-standing question by focusing on the relationship between synaptic plasticity and the somatic activity of pyramidal neurons. We first build a pyramidal neuron model integrating subcellular processes, including calcium dynamics, backpropagating action potentials, and plateau potentials. Our model predicts that apical synaptic inputs drive basal synaptic plasticity through somatic depolarization caused by plateau potentials. We then test this prediction through in vitro electrophysiology experiments in which we co-stimulate apical and basal synapses to induce basal plasticity. These results allow us to derive distinct predictions for the target learning and backpropagation hypotheses, which we test on in vivo neuronal activity data from the mouse visual cortex. Our findings reveal that cortical learning is consistent with target learning, but not backpropagation, highlighting a critical discrepancy between deep learning and hierarchical learning in the neocortex.
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
Revised the main text to improve readability. Simplified figures.