Abstract
We introduce a novel approach for studying neurons as sophisticated I/O information processing units by utilizing recent advances in machine learning. We trained deep neural networks (DNNs) to mimic the I/O behavior of a detailed nonlinear model of a layer 5 cortical pyramidal cell receiving rich spatio-temporal patterns of synaptic input activations. A temporally convolutional DNN (TCN) with seven layers was required to accurately, and very efficiently, capture the I/O of this neuron at millisecond resolution. This complexity arises primarily from local NMDA-based nonlinear dendritic conductances. The weight matrices of the DNN provide new insights into the I/O function of cortical pyramidal neurons, and the approach presented here can provide a systematic characterization of the functional complexity of different neuron types. Our results demonstrate that cortical neurons can be conceptualized as multi-layered “deep” processing units, implying that the cortical networks they form have a non-classical architecture and are potentially more computationally powerful than previously assumed.
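To make the modeling setup concrete, the sketch below illustrates one way such a temporally convolutional DNN could be set up to map binary synaptic activation rasters (1 ms bins) to predicted somatic voltage and per-millisecond spike probability. This is a minimal illustration, not the authors' code: the synapse count, filter widths, number of filters, and the Keras framework are all assumptions chosen for readability rather than the paper's fitted hyperparameters.

```python
# Minimal sketch of a causal temporal convolutional network (TCN) for neuron I/O
# prediction. All sizes below are illustrative placeholders, not the paper's values.
import tensorflow as tf

NUM_SYNAPSES = 1000   # assumed number of input synapses (placeholder)
TIME_STEPS = None     # variable-length input, one time step per millisecond

inputs = tf.keras.Input(shape=(TIME_STEPS, NUM_SYNAPSES))  # (time, synapses)
x = inputs
for _ in range(7):  # "seven layers", as stated in the abstract
    # causal padding ensures the output at time t depends only on inputs up to t
    x = tf.keras.layers.Conv1D(filters=128, kernel_size=35,
                               padding='causal', activation='relu')(x)

# two read-out heads: sub-threshold somatic voltage and per-ms spike probability
voltage = tf.keras.layers.Conv1D(1, 1, activation='linear', name='somatic_voltage')(x)
spikes = tf.keras.layers.Conv1D(1, 1, activation='sigmoid', name='spike_prob')(x)

model = tf.keras.Model(inputs, [voltage, spikes])
model.compile(optimizer='adam',
              loss={'somatic_voltage': 'mse',
                    'spike_prob': 'binary_crossentropy'})
model.summary()
```

Training such a model would amount to regressing the detailed biophysical simulation's somatic voltage and spike train against the same synaptic input patterns, with the causal convolutions guaranteeing that predictions at each millisecond use only past and present inputs.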
Footnotes
Communication: David Beniaguev - david.beniaguev@gmail.com
This updated version contains a small narrative change, a more precise Discussion section, several additional relevant citations, and 9 new supplementary figures. More importantly, links to code, data, and pretrained models have been added. In the updated work we used larger and more diverse training and testing datasets, trained the models for longer (resulting in better-performing models), and performed a more rigorous hyperparameter sweep.