Abstract
Many neural computations emerge from self-sustained patterns of activity in recurrent neural circuits, which rely on a balance of excitation and inhibition. Neuromorphic electronic circuits that use the physics of silicon to emulate neuronal dynamics are a promising approach for implementing the brain’s computational primitives, including self-sustained neural activity. However, matching the robustness of biological networks in neuromorphic computing systems remains a challenge, because of the high degree of heterogeneity and variability of their analog components.
Inspired by the strategies used by real cortical networks, we apply a biologically plausible cross-homeostatic learning rule to balance excitation and inhibition in neuromorphic implementations of spiking recurrent neural networks. We demonstrate how this learning rule allows the neuromorphic system to operate in the presence of device mismatch and to autonomously tune the spiking network to produce robust, self-sustained, fixed-point attractor dynamics with irregular spiking in an inhibition-stabilized regime. We show that this rule supports multiple coexisting stable memories with emergent soft winner-take-all (sWTA) dynamics, and reproduces the so-called “paradoxical effect” widely observed in cortical circuits. In addition to validating neuroscience models on a substrate that shares many properties and limitations with biological systems, this work enables the construction of ultra-low-power, mixed-signal neuromorphic technologies that can be automatically configured to compute reliably despite the large on-chip and chip-to-chip variability of their analog components.
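To illustrate the kind of balancing such a rule performs, the following is a minimal rate-based sketch, not the paper's spiking on-chip implementation. It uses a two-population E-I model with a small tonic drive (an assumption of the sketch, added so the toy model has a well-posed fixed point), and a "cross" update structure in which synapses onto the excitatory population follow the inhibitory rate error and vice versa. All parameter values, rate set-points, and sign conventions here are illustrative assumptions rather than values taken from the paper.

```python
def steady_rates(W, gE=1.0, gI=1.0, tau=10.0, dt=0.5, T=600.0, rmax=100.0):
    """Integrate a two-population E-I rate model (time in ms) and
    return the excitatory and inhibitory rates at the end of a trial."""
    wee, wei, wie, wii = W
    rE = rI = 0.0
    for _ in range(int(T / dt)):
        # rectified-linear transfer with an upper saturation for boundedness
        fE = min(max(wee * rE - wei * rI + gE, 0.0), rmax)
        fI = min(max(wie * rE - wii * rI + gI, 0.0), rmax)
        rE += dt * (-rE + fE) / tau
        rI += dt * (-rI + fI) / tau
    return rE, rI

def cross_homeostatic_training(W, E_set=5.0, I_set=14.0, eta=1e-3, trials=300):
    """Trial-by-trial cross-homeostatic weight updates (illustrative signs)."""
    wee, wei, wie, wii = W
    for _ in range(trials):
        rE, rI = steady_rates((wee, wei, wie, wii))
        eE, eI = E_set - rE, I_set - rI      # rate errors vs. set-points
        # "Cross" structure: weights onto E follow the inhibitory rate
        # error; weights onto I follow the excitatory rate error. Signs are
        # chosen so that each update moves the corresponding steady-state
        # rate toward its set-point in the inhibition-stabilized regime
        # of this toy model.
        wee += eta * eI
        wei -= eta * eI
        wie -= eta * eE
        wii += eta * eE
        # keep weights non-negative (Dale's law)
        wee, wei, wie, wii = (max(w, 0.0) for w in (wee, wei, wie, wii))
    return wee, wei, wie, wii

# Start from a stable but off-target network; training pulls the rates
# toward the (E_set, I_set) = (5, 14) operating point.
W = cross_homeostatic_training((2.1, 0.44, 4.0, 0.5))
rE, rI = steady_rates(W)
```

In the trained network W_EE > 1, so recurrent excitation alone would be unstable and the fixed point is inhibition-stabilized; accordingly, injecting extra drive into the inhibitory population in this sketch (e.g. `steady_rates(W, gI=1.3)`) lowers the steady-state inhibitory rate, the "paradoxical effect" referred to above.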
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
New results on the learning rule and soft winner-take-all dynamics. We implemented the Mackwood, Naumann & Sprekeler (2021) rule and present results on mixed-signal neuromorphic hardware. We also show that the network has sWTA properties without additional tuning of the inhibition.