Hydrogen bonds meet self-attention: all you need for general-purpose protein structure embedding

Cheng Chen, Yuguo Zha, Daming Zhu, Kang Ning, Xuefeng Cui
doi: https://doi.org/10.1101/2021.01.31.428935
1 School of Computer Science and Technology, Shandong University, Qingdao 266237, China (Cheng Chen, Daming Zhu, Xuefeng Cui)
2 College of Life Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, Hubei, China (Yuguo Zha, Kang Ning)
Correspondence: Kang Ning (ningkang@hust.edu.cn), Xuefeng Cui (xfcui@email.sdu.edu.cn)

Abstract

General-purpose protein structure embedding can be used for many important tasks in protein biology, such as protein design, drug design, and binding-affinity prediction. Recent research has shown that attention-based encoder layers are better suited to learning high-level features. Based on this key observation, we treat low-level and high-level representation learning separately and propose a two-level general-purpose protein structure embedding neural network, called ContactLib-ATT. At the local embedding level, a simple yet meaningful hydrogen-bond representation is learned. At the global embedding level, attention-based encoder layers are employed for global representation learning. In our experiments, ContactLib-ATT achieves a SCOP superfamily classification accuracy of 82.4% (i.e., 6.7% higher than the state-of-the-art method) on the SCOP40 2.07 dataset. Moreover, ContactLib-ATT is shown to successfully simulate a structure-based search engine for remote homologous proteins, and our top-10 candidate list contains at least one remote homolog with a probability of 91.9%. Source code: https://github.com/xfcui/contactlib.
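To illustrate the two-level design described above (a local encoder over per-contact hydrogen-bond features followed by self-attention encoder layers that produce a global structure embedding), the following is a minimal PyTorch sketch. It is not the authors' implementation: the class name, feature dimension, embedding size, number of encoder layers, and number of output classes are illustrative assumptions, and the local encoder is a generic MLP standing in for the paper's hydrogen-bond representation learning.

```python
# Minimal sketch of a two-level structure-embedding network, assuming PyTorch.
# The hydrogen-bond feature dimension, embedding sizes, and number of SCOP
# superfamily classes below are placeholders, not the values used by ContactLib-ATT.
import torch
import torch.nn as nn


class TwoLevelStructureEmbedding(nn.Module):
    def __init__(self, n_hbond_features=12, d_model=128, n_heads=8,
                 n_layers=4, n_classes=2000):
        super().__init__()
        # Local level: embed per-contact hydrogen-bond descriptors.
        self.local_encoder = nn.Sequential(
            nn.Linear(n_hbond_features, d_model),
            nn.ReLU(),
            nn.Linear(d_model, d_model),
        )
        # Global level: self-attention encoder layers over the set of contacts.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.global_encoder = nn.TransformerEncoder(encoder_layer, n_layers)
        # Classification head, e.g. for SCOP superfamily prediction.
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, hbond_features, padding_mask=None):
        # hbond_features: (batch, n_contacts, n_hbond_features)
        # padding_mask:   (batch, n_contacts) bool, True marks padded contacts
        local = self.local_encoder(hbond_features)
        globl = self.global_encoder(local, src_key_padding_mask=padding_mask)
        # Mean-pool over contacts to obtain a fixed-size structure embedding.
        if padding_mask is not None:
            keep = (~padding_mask).unsqueeze(-1).float()
            embedding = (globl * keep).sum(1) / keep.sum(1).clamp(min=1.0)
        else:
            embedding = globl.mean(dim=1)
        return embedding, self.classifier(embedding)


# Toy usage: a batch of 2 structures, each with 50 hydrogen-bond contacts.
model = TwoLevelStructureEmbedding()
x = torch.randn(2, 50, 12)
embedding, logits = model(x)
print(embedding.shape, logits.shape)  # torch.Size([2, 128]) torch.Size([2, 2000])
```

In a retrieval setting such as the remote-homolog search described in the abstract, the fixed-size embedding (rather than the classification logits) would be indexed and compared, e.g. by cosine similarity, to rank candidate structures.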

Competing Interest Statement

The authors have declared no competing interest.

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. All rights reserved. No reuse allowed without permission.
Posted August 29, 2021.
Subject Area

  • Bioinformatics