EpiGePT: a Pretrained Transformer model for epigenomics

Zijing Gao 1, Qiao Liu 2, Wanwen Zeng 2, Wing Hung Wong 2,3, Rui Jiang 1
doi: https://doi.org/10.1101/2023.07.15.549134
1 Ministry of Education Key Laboratory of Bioinformatics, Research Department of Bioinformatics at the Beijing National Research Center for Information Science and Technology, Center for Synthetic and Systems Biology, Department of Automation, Tsinghua University, Beijing 100084, China
2 Department of Statistics, Stanford University, Stanford, CA 94305, USA
3 Department of Biomedical Data Science, Bio-X Program, Center for Personal Dynamic Regulomes, Stanford University, Stanford, CA 94305, USA

Correspondence: liuqiao@stanford.edu, whwong@stanford.edu, ruijiang@tsinghua.edu.cn

Abstract

Transformer-based models such as GPT-3 [1] and DALL-E [2] have achieved unprecedented breakthroughs in natural language processing and computer vision. The inherent similarities between natural language and biological sequences have prompted a new wave of efforts to infer the grammatical rules underlying biological sequences. In genomic studies, it is worth noting that DNA sequence alone cannot explain all gene activities, owing to epigenetic mechanisms. To address this problem, we propose EpiGePT, a new transformer-based pretrained language model for epigenomics that predicts genome-wide epigenomic signals through mechanistic modeling of transcriptional regulation. Specifically, EpiGePT takes the context-specific activities of transcription factors (TFs) into account, which can offer deeper biological insight than models trained on DNA sequence alone. In a series of experiments, EpiGePT demonstrates state-of-the-art performance on a diverse set of epigenomic signal prediction tasks, as well as on new prediction tasks after fine-tuning. Furthermore, EpiGePT is capable of learning cell-type-specific long-range interactions through the self-attention mechanism and of interpreting genetic variants associated with human diseases. We expect the advances of EpiGePT to shed light on the complex regulatory mechanisms of gene regulation. We provide a free online prediction service for EpiGePT at https://health.tsinghua.edu.cn/epigept/.
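To make the architecture described in the abstract concrete, the sketch below shows one minimal way such a model could be wired up: per-bin DNA sequence features are fused with a per-bin vector of context-specific TF activities, a transformer encoder lets distal bins attend to one another, and a regression head predicts several epigenomic signal tracks per bin. This is an illustrative sketch, not the authors' implementation; the class name EpiGePTSketch, all dimensions (number of TFs, signal tracks, bin size), and the module choices (convolutional sequence embedding, max pooling, linear TF projection) are assumptions, since the abstract does not specify them.

```python
# Minimal sketch of an EpiGePT-style model in PyTorch. All sizes and module
# choices here are illustrative assumptions, not the published architecture.
import torch
import torch.nn as nn

class EpiGePTSketch(nn.Module):
    """Fuses per-bin DNA sequence features with context-specific TF activity
    features, then predicts epigenomic signals for every genomic bin."""

    def __init__(self, n_tfs=711, n_signals=8, d_model=256,
                 n_heads=8, n_layers=4, bin_len=128):
        super().__init__()
        # Sequence module: embed one-hot DNA, then pool each bin to one vector.
        self.seq_conv = nn.Conv1d(4, d_model, kernel_size=9, padding=4)
        self.pool = nn.MaxPool1d(kernel_size=bin_len)
        # TF module: project the per-bin TF activity vector (e.g. motif binding
        # score combined with TF expression) into the model dimension.
        self.tf_proj = nn.Linear(n_tfs, d_model)
        # Self-attention over bins is what allows cell-type-specific long-range
        # interactions to be learned and later read out of the attention maps.
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # One regression output per epigenomic signal track per bin.
        self.head = nn.Linear(d_model, n_signals)

    def forward(self, one_hot_dna, tf_activity):
        # one_hot_dna: (batch, 4, n_bins * bin_len)
        # tf_activity: (batch, n_bins, n_tfs)
        x = self.pool(torch.relu(self.seq_conv(one_hot_dna)))  # (B, d_model, n_bins)
        x = x.transpose(1, 2) + self.tf_proj(tf_activity)      # (B, n_bins, d_model)
        return self.head(self.encoder(x))                      # (B, n_bins, n_signals)

# Example: a region split into 1000 bins of 128 bp, 711 TFs, 8 signal tracks.
model = EpiGePTSketch()
dna = torch.randn(2, 4, 1000 * 128)   # one-hot encoded in practice; random here
tfs = torch.randn(2, 1000, 711)
print(model(dna, tfs).shape)          # torch.Size([2, 1000, 8])
```

Adding the projected TF activities to the sequence embedding, rather than concatenating them, is one simple fusion choice among several; the key property the sketch illustrates is that the TF input makes the prediction context-specific while the DNA input stays fixed across cell types.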

Competing Interest Statement

The authors have declared no competing interest.

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.
Posted July 18, 2023.
Subject Area: Genomics