RT Journal Article
SR Electronic
T1 Neural networks with optimized single-neuron adaptation uncover biologically plausible regularization
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 2022.04.29.489963
DO 10.1101/2022.04.29.489963
A1 Geadah, Victor
A1 Horoi, Stefan
A1 Kerg, Giancarlo
A1 Wolf, Guy
A1 Lajoie, Guillaume
YR 2023
UL http://biorxiv.org/content/early/2023/07/19/2022.04.29.489963.abstract
AB Neurons in the brain have rich and adaptive input-output properties. Features such as heterogeneous f-I curves and spike frequency adaptation are known to place single neurons in optimal coding regimes when facing changing stimuli. Yet, it is still unclear how brain circuits exploit single-neuron flexibility, and how network-level requirements may have shaped such cellular function. To answer this question, a multi-scaled approach is needed where the computations of single neurons and neural circuits must be considered as a complete system. In this work, we use artificial neural networks to systematically investigate single-neuron input-output adaptive mechanisms, optimized in an end-to-end fashion. Throughout the optimization process, each neuron has the liberty to modify its nonlinear activation function, parametrized to mimic f-I curves of biological neurons, and to learn adaptation strategies to modify activation functions in real-time during a task. We find that such networks show much-improved robustness to noise and changes in input statistics. Importantly, we find that this procedure recovers precise coding strategies found in biological neurons, such as gain scaling and fractional order differentiation/integration. Using tools from dynamical systems theory, we analyze the role of these emergent single-neuron properties and argue that neural diversity and adaptation play an active regularization role, enabling neural circuits to optimally propagate information across time. Competing Interest Statement: The authors have declared no competing interest.