Abstract
Closing the gap between measurable genetic information and observable traits is a longstanding challenge in genomics. To date, however, the prediction of molecular phenotypes from DNA sequence alone remains limited and inaccurate, often owing to the scarcity of annotated data and the inability to transfer learned representations between prediction tasks. Here, we present an extensive study of foundation models pre-trained on DNA sequences, named the Nucleotide Transformer, integrating information from 3,202 diverse human genomes as well as 850 genomes from a wide range of species, including model and non-model organisms. These transformer models yield transferable, context-specific representations of nucleotide sequences, which allow for accurate molecular phenotype prediction even in low-data settings. We show that the sequence representations alone match or outperform specialized methods on 12 of 18 prediction tasks, and on up to 15 after fine-tuning. Despite receiving no supervision, the transformer models learned to focus attention on key genomic elements, including those that regulate gene expression, such as enhancers. Lastly, we demonstrate that utilizing model representations can improve the prioritization of functional genetic variants. The training and application of foundation models in genomics explored in this study provide a widely applicable stepping stone toward accurate molecular phenotype prediction from DNA sequence. Code and weights are available at: https://github.com/instadeepai/nucleotide-transformer.
Competing Interest Statement
The authors have declared no competing interest.
Footnotes
- Added 10-fold validation for fine-tuning experiments
- Added DeepSEA and DeepSTARR experiments
- Corrected UMAP analysis
- Improved figures
- Improved text
2. http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/data_collections/1000G_2504_high_coverage/working/20201028_3202_phased/
4. https://jax.readthedocs.io/en/latest/_autosummary/jax.pmap.html
7. https://git.unistra.fr/nscalzitti/spliceator/-/tree/master/Data/Datasets
8. http://deepsea.princeton.edu/media/code/deepsea_train_bundle.v0.9.tar.gz