PT - JOURNAL ARTICLE
AU - Ryan Golden
AU - Jean Erik Delanois
AU - Pavel Sanda
AU - Maxim Bazhenov
TI - Sleep prevents catastrophic forgetting in spiking neural networks by forming joint synaptic weight representations
AID - 10.1101/688622
DP - 2020 Jan 01
TA - bioRxiv
PG - 688622
4099 - http://biorxiv.org/content/early/2020/06/10/688622.short
4100 - http://biorxiv.org/content/early/2020/06/10/688622.full
AB - Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously and typically learns best when new learning is interleaved with periods of sleep for memory consolidation. In this study, we used a spiking neural network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on multiple tasks. Training on a new task moved the synaptic weight configuration away from the manifold representing the old tasks, leading to forgetting. Interleaving new task training with periods of offline reactivation, mimicking biological sleep, mitigated catastrophic forgetting by pushing the synaptic weight configuration towards the intersection of the solution manifolds representing multiple tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
Competing Interest Statement: The authors have declared no competing interest.