RT Journal Article
SR Electronic
T1 The combination of Hebbian and predictive plasticity learns invariant object representations in deep sensory networks
JF bioRxiv
FD Cold Spring Harbor Laboratory
SP 2022.03.17.484712
DO 10.1101/2022.03.17.484712
A1 Manu Srinath Halvagal
A1 Friedemann Zenke
YR 2022
UL http://biorxiv.org/content/early/2022/03/19/2022.03.17.484712.abstract
AB Discriminating distinct objects and concepts from sensory stimuli is essential for survival. Our brains accomplish this feat by forming meaningful internal representations in deep sensory networks with plastic synaptic connections. Experience-dependent plasticity presumably exploits temporal contingencies between sensory inputs to build these internal representations. However, the precise mechanisms underlying plasticity remain elusive. We derive a local synaptic plasticity model inspired by self-supervised machine learning techniques that shares a deep conceptual connection to Bienenstock-Cooper-Munro (BCM) theory and is consistent with experimentally observed plasticity rules. We show that our plasticity model yields disentangled object representations in deep neural networks without the need for supervision or implausible negative examples. In response to altered visual experience, our model qualitatively captures neuronal selectivity changes observed in the monkey inferotemporal cortex in vivo. Our work suggests a plausible learning rule to drive learning in sensory networks while making concrete testable predictions.
Competing Interest Statement: The authors have declared no competing interest.