Convolutions are competitive with transformers for protein sequence pretraining

Kevin K. Yang, Alex X. Lu, Nicolo Fusi
doi: https://doi.org/10.1101/2022.05.19.492714
Kevin K. Yang
Microsoft Research New England
Correspondence: seinchin@gmail.com, yang.kevin@microsoft.com
Alex X. Lu
Microsoft Research New England
Correspondence: lualex@microsoft.com
Nicolo Fusi
Microsoft Research New England
Correspondence: fusi@microsoft.com

Abstract

Pretrained protein sequence language models largely rely on the transformer architecture. However, transformer run-time and memory requirements scale quadratically with sequence length. We investigate the potential of a convolution-based architecture for protein sequence masked language model pretraining and subsequent finetuning. CNNs are competitive with transformers on the pretraining task across several orders of magnitude in parameter count while scaling linearly with sequence length. More importantly, CNNs are competitive with, and occasionally superior to, transformers across an extensive set of downstream evaluations, including structure prediction, zero-shot mutation effect prediction, and out-of-domain generalization. We also demonstrate strong performance on sequences longer than the positional embeddings allow in current state-of-the-art transformer protein masked language models. Finally, we close with a call to disentangle the effects of pretraining task and model architecture when studying pretrained protein sequence models.
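
Illustrative sketch (not taken from the preprint): the abstract contrasts the quadratic cost of self-attention with the linear cost of convolutions and describes masked language model pretraining on protein sequences. The minimal PyTorch sketch below shows the general setup of a dilated-convolution masked language model; the vocabulary size, mask token, layer widths, and dilation schedule are assumptions for illustration, not the authors' architecture or hyperparameters.

```python
# Minimal sketch of a convolutional masked language model for protein
# sequences. All hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

AA_VOCAB = 22            # 20 amino acids + pad + mask (assumed vocabulary)
MASK_IDX = AA_VOCAB - 1  # index of the mask token (assumption)

class DilatedConvBlock(nn.Module):
    """Residual block of dilated 1-D convolutions over the sequence."""
    def __init__(self, d_model: int, dilation: int, kernel_size: int = 5):
        super().__init__()
        padding = dilation * (kernel_size - 1) // 2  # preserve sequence length
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=padding, dilation=dilation)
        self.norm = nn.LayerNorm(d_model)
        self.act = nn.GELU()

    def forward(self, x):                  # x: (batch, length, d_model)
        h = self.norm(x).transpose(1, 2)   # -> (batch, d_model, length)
        h = self.act(self.conv(h)).transpose(1, 2)
        return x + h                       # residual connection

class ConvMaskedLM(nn.Module):
    """Embed tokens, apply a dilated conv stack, predict masked residues.
    Compute and memory grow O(L) in sequence length, vs O(L^2) for
    self-attention."""
    def __init__(self, d_model: int = 128, n_blocks: int = 8):
        super().__init__()
        self.embed = nn.Embedding(AA_VOCAB, d_model)
        dilations = [2 ** (i % 4) for i in range(n_blocks)]  # assumed schedule
        self.blocks = nn.Sequential(*[DilatedConvBlock(d_model, d)
                                      for d in dilations])
        self.head = nn.Linear(d_model, AA_VOCAB)

    def forward(self, tokens):             # tokens: (batch, length) int64
        return self.head(self.blocks(self.embed(tokens)))

def mlm_loss(model, tokens, mask_rate: float = 0.15):
    """Masked-LM objective: corrupt ~15% of positions, score only those."""
    mask = torch.rand_like(tokens, dtype=torch.float) < mask_rate
    corrupted = tokens.masked_fill(mask, MASK_IDX)
    logits = model(corrupted)
    return nn.functional.cross_entropy(logits[mask], tokens[mask])
```

Because each dilated convolution touches a fixed number of neighbors per position, cost grows linearly with sequence length; no positional embedding table is needed, which is why such a model can, in principle, process sequences longer than a transformer's fixed positional-embedding budget.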

Competing Interest Statement

The authors have declared no competing interest.

Copyright 
The copyright holder has placed this preprint in the Public Domain. It is no longer restricted by copyright. Anyone can legally share, reuse, remix, or adapt this material for any purpose without crediting the original authors.
Posted May 20, 2022.
Subject Area

  • Bioengineering