bioRxiv

MSA Transformer

Roshan Rao, Jason Liu, Robert Verkuil, Joshua Meier, John F. Canny, Pieter Abbeel, Tom Sercu, Alexander Rives
doi: https://doi.org/10.1101/2021.02.12.430858
Roshan Rao (1,2; correspondence: [email protected])
Jason Liu (3)
Robert Verkuil (3)
Joshua Meier (3,4)
John F. Canny (1)
Pieter Abbeel (1)
Tom Sercu (3)
Alexander Rives (3,4; correspondence: [email protected])

Affiliations: 1 UC Berkeley; 2 Work performed during internship at FAIR; 3 Facebook AI Research; 4 New York University.

Abstract

Unsupervised protein language models trained across millions of diverse sequences learn the structure and function of proteins. Protein language models studied to date have been trained to perform inference from individual sequences. The longstanding approach in computational biology has been to make inferences from a family of evolutionarily related sequences by fitting a model to each family independently. In this work we combine the two paradigms. We introduce a protein language model that takes as input a set of sequences in the form of a multiple sequence alignment. The model interleaves row and column attention across the input sequences and is trained with a variant of the masked language modeling objective across many protein families. The model surpasses current state-of-the-art unsupervised structure learning methods by a wide margin, with far greater parameter efficiency than prior state-of-the-art protein language models.
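The interleaved row/column attention pattern described in the abstract can be illustrated with a toy sketch. This is not the authors' implementation (the actual model uses learned query/key/value projections, multiple attention heads, tied row attention, and feed-forward layers; see the linked code repository): here, purely for illustration, queries, keys, and values are the raw per-residue embeddings themselves, and the function names `attend` and `axial_attention` are invented for this sketch. Row attention lets each sequence attend across its own alignment positions; column attention lets each alignment column attend across sequences.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attend(queries, keys, values):
    """Scaled dot-product attention over lists of d-dim vectors."""
    d = len(queries[0])
    out = []
    for q in queries:
        scores = softmax(
            [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
        )
        out.append([sum(w * v[i] for w, v in zip(scores, values)) for i in range(d)])
    return out

def axial_attention(msa):
    """One interleaved block over an R x C grid of d-dim embeddings:
    row attention (within each sequence), then column attention
    (within each alignment column)."""
    # Row attention: each sequence attends across its own positions.
    after_rows = [attend(row, row, row) for row in msa]
    # Column attention: transpose, attend within each column, transpose back.
    R, C = len(after_rows), len(after_rows[0])
    cols = [[after_rows[r][c] for r in range(R)] for c in range(C)]
    cols = [attend(col, col, col) for col in cols]
    return [[cols[c][r] for c in range(C)] for r in range(R)]
```

The key design point this sketch makes concrete is cost: attending over the full flattened MSA of R sequences by C columns would scale as O(R²C²), while factoring attention into rows and columns scales as O(RC²) + O(R²C).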

Competing Interest Statement

The authors have declared no competing interest.

Footnotes

  • Code and weights available at https://github.com/facebookresearch/esm.

  • Added citation to Huang et al. 2019 (CCNet)

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.
Posted August 27, 2021.
Subject Area

  • Synthetic Biology