
Short-term Hebbian learning can implement transformer-like attention

Ian T. Ellwood
doi: https://doi.org/10.1101/2023.05.31.543109
Department of Neurobiology and Behavior, Cornell University, Ithaca, NY 14853, USA
For correspondence: ite2@cornell.edu

Abstract

Transformers have revolutionized machine learning models of language and vision, but their connection with neuroscience remains tenuous. Built from attention layers, they require a mass comparison of queries and keys that is difficult to perform using traditional neural circuits. Here, we show that neurons can implement attention-like computations using short-term, Hebbian synaptic potentiation. We call our mechanism the match-and-control principle. It proposes that when activity in an axon is synchronous, or matched, with the somatic activity of the neuron it synapses onto, the synapse can be briefly and strongly potentiated, allowing the axon to take over, or control, the activity of the downstream neuron for a short time. In our scheme, the keys and queries are represented as spike trains, and comparisons between the two are performed in individual spines, allowing for hundreds of key comparisons per query and roughly as many keys and queries as there are neurons in the network.
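The abstract maps transformer attention onto synaptic physiology: queries and keys become spike trains, the query-key comparison becomes a coincidence ("match") score computed at individual spines, and a sufficiently well-matched synapse is briefly potentiated so that its input "controls" the postsynaptic neuron. The toy sketch below is not the paper's implementation (see the repository linked under Footnotes for that); it only contrasts standard dot-product attention with a spike-train coincidence score, and all encodings, rates, window sizes, and thresholds are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def softmax_attention(query, keys, values):
    # Standard scaled dot-product attention for a single query.
    scores = keys @ query / np.sqrt(query.size)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values, weights

def poisson_spike_train(rate_hz, duration_s=0.5, dt=0.001):
    # 0/1 spike vector; a crude stand-in for encoding a key or query as a
    # spike train (illustrative assumption, not the paper's encoding).
    return (rng.random(int(duration_s / dt)) < rate_hz * dt).astype(float)

def match_and_control_sketch(query_train, key_trains, values, window=5, threshold=3.0):
    # "Match": count near-coincident spikes between the query (somatic activity)
    # and each key (axonal input) within a short window, one score per spine.
    kernel = np.ones(window)
    smoothed_query = np.convolve(query_train, kernel, mode="same")
    matches = np.array([(k * smoothed_query).sum() for k in key_trains])
    best = int(np.argmax(matches))
    # "Control": only a sufficiently strong coincidence potentiates the synapse,
    # letting that input's value dominate the downstream neuron's output.
    if matches[best] >= threshold:
        return values[best], matches
    return np.zeros_like(values[0]), matches

# Toy comparison on four key/value pairs.
d = 8
keys = rng.normal(size=(4, d))
values = rng.normal(size=(4, d))
query = keys[2] + 0.1 * rng.normal(size=d)           # query resembles key 2
out, w = softmax_attention(query, keys, values)
print("softmax attention weights:", np.round(w, 2))   # weight on key 2 dominates

# Spike-train version: firing rates loosely stand in for the same comparison.
key_trains = [poisson_spike_train(max(5.0, 20 + 10 * k[0])) for k in keys]
query_train = key_trains[2]                           # perfectly matched to key 2
out2, matches = match_and_control_sketch(query_train, key_trains, values)
print("coincidence scores per synapse:", matches)     # synapse 2 wins the match

In this sketch the softmax is replaced by a hard winner-take-all over coincidence scores; that simplification is chosen for brevity here and is not a claim about the paper's mechanism.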

Competing Interest Statement

The author has declared no competing interest.

Footnotes

  • https://github.com/iellwood/MatchAndControlPaper

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.
Posted June 04, 2023.

Subject Area

  • Neuroscience