bioRxiv
Sparse RNNs can support high-capacity classification

Denis Turcu, L. F. Abbott
doi: https://doi.org/10.1101/2022.05.18.492540
Denis Turcu
The Mortimer B. Zuckerman Mind, Brain and Behavior Institute, Department of Neuroscience, Department of Physiology and Cellular Biophysics, Columbia University, New York NY 10027 USA
For correspondence: dt2626@cumc.columbia.edu

L. F. Abbott
The Mortimer B. Zuckerman Mind, Brain and Behavior Institute, Department of Neuroscience, Department of Physiology and Cellular Biophysics, Columbia University, New York NY 10027 USA

Abstract

Feedforward network models performing classification tasks rely on highly convergent output units that collect the information passed on by preceding layers. Although convergent, output-unit-like neurons may exist in some biological neural circuits, notably the cerebellar cortex, neocortical circuits exhibit no obvious candidates for this role; instead, they are highly recurrent. We investigate whether a sparsely connected recurrent neural network (RNN) can perform classification in a distributed manner without ever bringing all of the relevant information to a single convergence site. Our model is based on a sparse RNN that performs classification dynamically. Specifically, the interconnections of the RNN are trained to resonantly amplify the magnitude of responses to some external inputs but not others. The amplified and non-amplified responses then form the basis for binary classification. Furthermore, the network acts as an evidence accumulator and maintains its decision even after the input is turned off. Despite highly sparse connectivity, learned recurrent connections allow input information to flow to every neuron of the RNN, providing the basis for distributed computation. In this arrangement, the minimum number of synapses per neuron required to reach maximum memory capacity scales only logarithmically with network size. The model is robust to various types of noise, works with different activation and loss functions and with both backpropagation- and Hebbian-based learning rules. The RNN can also be constructed with a split excitation-inhibition architecture with little reduction in performance.
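The core mechanism described in the abstract — recurrent connections that resonantly amplify responses to some inputs but not others, with classification read out from response magnitude — can be illustrated with a toy sketch. This is not the authors' implementation (see their SparseRNN repository for that); it is a minimal linear, hand-constructed stand-in in which a sparse random recurrent matrix is augmented with a low-rank component playing the role of the learned connectivity. The network size `N`, in-degree `K`, and the gain values are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200   # number of neurons (illustrative toy value)
K = 10    # nonzero recurrent synapses per neuron -- sparse connectivity

# Sparse random recurrent weight matrix: K nonzero entries per row.
W = np.zeros((N, N))
for i in range(N):
    idx = rng.choice(N, size=K, replace=False)
    W[i, idx] = rng.normal(0.0, 1.0 / np.sqrt(K), size=K)

# Stand-in for the learned connectivity: a low-rank term that places an
# eigenvalue just below 1 along a chosen input direction u, so inputs
# aligned with u are resonantly amplified by the recurrent dynamics.
u = rng.normal(size=N)
u /= np.linalg.norm(u)
W = 0.1 * W + 0.95 * np.outer(u, u)

def response_magnitude(x, T=50):
    """Iterate the linear dynamics r <- W r + x and return the final |r|."""
    r = np.zeros(N)
    for _ in range(T):
        r = W @ r + x
    return np.linalg.norm(r)

# An input aligned with u (the "amplified" class) versus a random input
# of the same norm (the "non-amplified" class).
x_plus = u
x_minus = rng.normal(size=N)
x_minus /= np.linalg.norm(x_minus)

m_plus = response_magnitude(x_plus)    # large: amplified by ~1/(1 - 0.95)
m_minus = response_magnitude(x_minus)  # small: near the input magnitude

print(f"amplified response: {m_plus:.1f}, non-amplified: {m_minus:.1f}")
```

Thresholding the response magnitude then yields a binary classifier, with the information remaining distributed across all neurons rather than converging onto a single readout unit. In the paper this low-rank component is replaced by trained sparse connections, and the dynamics are generally nonlinear; this sketch only shows why resonant amplification separates the two response classes.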

Competing Interest Statement

The authors have declared no competing interest.

Footnotes

  • https://github.com/DenisTurcu/SparseRNN

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY 4.0 International license.
Posted May 19, 2022.


Subject Area

  • Neuroscience