bioRxiv · New Results
Biological Structure and Function Emerge from Scaling Unsupervised Learning to 250 Million Protein Sequences

Alexander Rives, Siddharth Goyal, Joshua Meier, Demi Guo, Myle Ott, C. Lawrence Zitnick, Jerry Ma, Rob Fergus
doi: https://doi.org/10.1101/622803
Alexander Rives ‡ (correspondence: arives@cs.nyu.edu)
Siddharth Goyal §
Joshua Meier §
Demi Guo §
Myle Ott §
C. Lawrence Zitnick §
Jerry Ma § (correspondence: maj@fb.com)
Rob Fergus ‡§ (correspondence: robfergus@fb.com)

‡ Dept. of Computer Science, New York University, USA
§ Facebook AI Research, USA
Abstract

In the field of artificial intelligence, a combination of scale in data and model capacity enabled by unsupervised learning has led to major advances in representation learning and statistical generation. In biology, the anticipated growth of sequencing promises unprecedented data on natural sequence diversity. Learning the natural distribution of evolutionary protein sequence variation is a logical step toward predictive and generative modeling for biology. To this end we use unsupervised learning to train a deep contextual language model on 86 billion amino acids across 250 million sequences spanning evolutionary diversity. The resulting model maps raw sequences to representations of biological properties without labels or prior domain knowledge. The learned representation space organizes sequences at multiple levels of biological granularity from the biochemical to proteomic levels. Learning recovers information about protein structure: secondary structure and residue-residue contacts can be extracted by linear projections from learned representations. With small amounts of labeled data, the ability to identify tertiary contacts is further improved. Learning on full sequence diversity rather than individual protein families increases recoverable information about secondary structure. We show the networks generalize by adapting them to variant activity prediction from sequences only, with results that are comparable to a state-of-the-art variant predictor that uses evolutionary and structurally derived features.
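The abstract's claim that structural information "can be extracted by linear projections from learned representations" can be illustrated with a minimal sketch. This is not the authors' code: the dimensions, the random stand-in embeddings, and the low-rank symmetric pair scoring are all illustrative assumptions; in the paper the projections are fit to labeled data on top of a pretrained language model's per-residue representations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a protein of L residues, per-residue embeddings
# of size d from a pretrained language model (simulated here with random
# data purely for illustration).
L, d = 50, 64
embeddings = rng.normal(size=(L, d))

# Linear probe for secondary structure: a single matrix mapping each
# residue embedding to logits over 8 secondary-structure classes (Q8).
W_ss = rng.normal(size=(d, 8))
ss_logits = embeddings @ W_ss        # shape (L, 8)
ss_pred = ss_logits.argmax(axis=-1)  # per-residue class prediction

# Linear probe for residue-residue contacts: project embeddings to a
# low-rank space and score each residue pair with a symmetric form.
k = 16
P = rng.normal(size=(d, k))
z = embeddings @ P                   # (L, k)
contact_scores = z @ z.T             # (L, L) pair scores
contact_scores = (contact_scores + contact_scores.T) / 2  # enforce symmetry

print(ss_pred.shape, contact_scores.shape)
```

The point of the sketch is that both probes are purely linear in the frozen representations: all structural signal in the predictions must already be present in the embedding space, which is what makes the linear-projection result in the abstract informative.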

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.
Posted April 29, 2019.
Subject Area

  • Synthetic Biology